Nov 29 00:35:36 np0005539509 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 00:35:36 np0005539509 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 00:35:36 np0005539509 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:36 np0005539509 kernel: BIOS-provided physical RAM map:
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 00:35:36 np0005539509 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 00:35:36 np0005539509 kernel: NX (Execute Disable) protection: active
Nov 29 00:35:36 np0005539509 kernel: APIC: Static calls initialized
Nov 29 00:35:36 np0005539509 kernel: SMBIOS 2.8 present.
Nov 29 00:35:36 np0005539509 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 00:35:36 np0005539509 kernel: Hypervisor detected: KVM
Nov 29 00:35:36 np0005539509 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 00:35:36 np0005539509 kernel: kvm-clock: using sched offset of 3318926677 cycles
Nov 29 00:35:36 np0005539509 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 00:35:36 np0005539509 kernel: tsc: Detected 2799.998 MHz processor
Nov 29 00:35:36 np0005539509 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 00:35:36 np0005539509 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 00:35:36 np0005539509 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 00:35:36 np0005539509 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 00:35:36 np0005539509 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 00:35:36 np0005539509 kernel: Using GB pages for direct mapping
Nov 29 00:35:36 np0005539509 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 00:35:36 np0005539509 kernel: ACPI: Early table checksum verification disabled
Nov 29 00:35:36 np0005539509 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 00:35:36 np0005539509 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:36 np0005539509 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:36 np0005539509 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:36 np0005539509 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 00:35:36 np0005539509 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:36 np0005539509 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:36 np0005539509 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 00:35:36 np0005539509 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 00:35:36 np0005539509 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 00:35:36 np0005539509 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 00:35:36 np0005539509 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 00:35:36 np0005539509 kernel: No NUMA configuration found
Nov 29 00:35:36 np0005539509 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 00:35:36 np0005539509 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 29 00:35:36 np0005539509 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 00:35:36 np0005539509 kernel: Zone ranges:
Nov 29 00:35:36 np0005539509 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 00:35:36 np0005539509 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 00:35:36 np0005539509 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:36 np0005539509 kernel:  Device   empty
Nov 29 00:35:36 np0005539509 kernel: Movable zone start for each node
Nov 29 00:35:36 np0005539509 kernel: Early memory node ranges
Nov 29 00:35:36 np0005539509 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 00:35:36 np0005539509 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 00:35:36 np0005539509 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:36 np0005539509 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 00:35:36 np0005539509 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 00:35:36 np0005539509 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 00:35:36 np0005539509 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 00:35:36 np0005539509 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 00:35:36 np0005539509 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 00:35:36 np0005539509 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 00:35:36 np0005539509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 00:35:36 np0005539509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 00:35:36 np0005539509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 00:35:36 np0005539509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 00:35:36 np0005539509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 00:35:36 np0005539509 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 00:35:36 np0005539509 kernel: TSC deadline timer available
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Max. logical packages:   8
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Max. logical dies:       8
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Max. dies per package:   1
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Max. threads per core:   1
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Num. cores per package:     1
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Num. threads per package:   1
Nov 29 00:35:36 np0005539509 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 00:35:36 np0005539509 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 00:35:36 np0005539509 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 00:35:36 np0005539509 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 00:35:36 np0005539509 kernel: Booting paravirtualized kernel on KVM
Nov 29 00:35:36 np0005539509 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 00:35:36 np0005539509 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 00:35:36 np0005539509 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 00:35:36 np0005539509 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 00:35:36 np0005539509 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:36 np0005539509 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 00:35:36 np0005539509 kernel: random: crng init done
Nov 29 00:35:36 np0005539509 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: Fallback order for Node 0: 0 
Nov 29 00:35:36 np0005539509 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 00:35:36 np0005539509 kernel: Policy zone: Normal
Nov 29 00:35:36 np0005539509 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 00:35:36 np0005539509 kernel: software IO TLB: area num 8.
Nov 29 00:35:36 np0005539509 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 00:35:36 np0005539509 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 00:35:36 np0005539509 kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 00:35:36 np0005539509 kernel: Dynamic Preempt: voluntary
Nov 29 00:35:36 np0005539509 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 00:35:36 np0005539509 kernel: rcu: 	RCU event tracing is enabled.
Nov 29 00:35:36 np0005539509 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 00:35:36 np0005539509 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 29 00:35:36 np0005539509 kernel: 	Rude variant of Tasks RCU enabled.
Nov 29 00:35:36 np0005539509 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 29 00:35:36 np0005539509 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 00:35:36 np0005539509 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 00:35:36 np0005539509 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:36 np0005539509 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:36 np0005539509 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:36 np0005539509 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 00:35:36 np0005539509 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 00:35:36 np0005539509 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 00:35:36 np0005539509 kernel: Console: colour VGA+ 80x25
Nov 29 00:35:36 np0005539509 kernel: printk: console [ttyS0] enabled
Nov 29 00:35:36 np0005539509 kernel: ACPI: Core revision 20230331
Nov 29 00:35:36 np0005539509 kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 00:35:36 np0005539509 kernel: x2apic enabled
Nov 29 00:35:36 np0005539509 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 00:35:36 np0005539509 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 00:35:36 np0005539509 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 29 00:35:36 np0005539509 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 00:35:36 np0005539509 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 00:35:36 np0005539509 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 00:35:36 np0005539509 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 00:35:36 np0005539509 kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 00:35:36 np0005539509 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 00:35:36 np0005539509 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 00:35:36 np0005539509 kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 00:35:36 np0005539509 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 00:35:36 np0005539509 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 00:35:36 np0005539509 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 00:35:36 np0005539509 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 00:35:36 np0005539509 kernel: x86/bugs: return thunk changed
Nov 29 00:35:36 np0005539509 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 00:35:36 np0005539509 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 00:35:36 np0005539509 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 00:35:36 np0005539509 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 00:35:36 np0005539509 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 00:35:36 np0005539509 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 00:35:36 np0005539509 kernel: Freeing SMP alternatives memory: 40K
Nov 29 00:35:36 np0005539509 kernel: pid_max: default: 32768 minimum: 301
Nov 29 00:35:36 np0005539509 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 00:35:36 np0005539509 kernel: landlock: Up and running.
Nov 29 00:35:36 np0005539509 kernel: Yama: becoming mindful.
Nov 29 00:35:36 np0005539509 kernel: SELinux:  Initializing.
Nov 29 00:35:36 np0005539509 kernel: LSM support for eBPF active
Nov 29 00:35:36 np0005539509 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 00:35:36 np0005539509 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 00:35:36 np0005539509 kernel: ... version:                0
Nov 29 00:35:36 np0005539509 kernel: ... bit width:              48
Nov 29 00:35:36 np0005539509 kernel: ... generic registers:      6
Nov 29 00:35:36 np0005539509 kernel: ... value mask:             0000ffffffffffff
Nov 29 00:35:36 np0005539509 kernel: ... max period:             00007fffffffffff
Nov 29 00:35:36 np0005539509 kernel: ... fixed-purpose events:   0
Nov 29 00:35:36 np0005539509 kernel: ... event mask:             000000000000003f
Nov 29 00:35:36 np0005539509 kernel: signal: max sigframe size: 1776
Nov 29 00:35:36 np0005539509 kernel: rcu: Hierarchical SRCU implementation.
Nov 29 00:35:36 np0005539509 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 29 00:35:36 np0005539509 kernel: smp: Bringing up secondary CPUs ...
Nov 29 00:35:36 np0005539509 kernel: smpboot: x86: Booting SMP configuration:
Nov 29 00:35:36 np0005539509 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 00:35:36 np0005539509 kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 00:35:36 np0005539509 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 29 00:35:36 np0005539509 kernel: node 0 deferred pages initialised in 6ms
Nov 29 00:35:36 np0005539509 kernel: Memory: 7766056K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616276K reserved, 0K cma-reserved)
Nov 29 00:35:36 np0005539509 kernel: devtmpfs: initialized
Nov 29 00:35:36 np0005539509 kernel: x86/mm: Memory block size: 128MB
Nov 29 00:35:36 np0005539509 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 00:35:36 np0005539509 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 00:35:36 np0005539509 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 00:35:36 np0005539509 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 00:35:36 np0005539509 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 00:35:36 np0005539509 kernel: audit: initializing netlink subsys (disabled)
Nov 29 00:35:36 np0005539509 kernel: audit: type=2000 audit(1764394534.880:1): state=initialized audit_enabled=0 res=1
Nov 29 00:35:36 np0005539509 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 00:35:36 np0005539509 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 00:35:36 np0005539509 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 00:35:36 np0005539509 kernel: cpuidle: using governor menu
Nov 29 00:35:36 np0005539509 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 00:35:36 np0005539509 kernel: PCI: Using configuration type 1 for base access
Nov 29 00:35:36 np0005539509 kernel: PCI: Using configuration type 1 for extended access
Nov 29 00:35:36 np0005539509 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 00:35:36 np0005539509 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 00:35:36 np0005539509 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 00:35:36 np0005539509 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 00:35:36 np0005539509 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 00:35:36 np0005539509 kernel: Demotion targets for Node 0: null
Nov 29 00:35:36 np0005539509 kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 00:35:36 np0005539509 kernel: ACPI: Added _OSI(Module Device)
Nov 29 00:35:36 np0005539509 kernel: ACPI: Added _OSI(Processor Device)
Nov 29 00:35:36 np0005539509 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 00:35:36 np0005539509 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 00:35:36 np0005539509 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 00:35:36 np0005539509 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 00:35:36 np0005539509 kernel: ACPI: Interpreter enabled
Nov 29 00:35:36 np0005539509 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 00:35:36 np0005539509 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 00:35:36 np0005539509 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 00:35:36 np0005539509 kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 00:35:36 np0005539509 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 00:35:36 np0005539509 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 00:35:36 np0005539509 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [3] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [4] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [5] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [6] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [7] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [8] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [9] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [10] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [11] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [12] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [13] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [14] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [15] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [16] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [17] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [18] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [19] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [20] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [21] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [22] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [23] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [24] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [25] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [26] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [27] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [28] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [29] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [30] registered
Nov 29 00:35:36 np0005539509 kernel: acpiphp: Slot [31] registered
Nov 29 00:35:36 np0005539509 kernel: PCI host bridge to bus 0000:00
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 00:35:36 np0005539509 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 00:35:36 np0005539509 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 00:35:36 np0005539509 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 00:35:36 np0005539509 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 00:35:36 np0005539509 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 00:35:36 np0005539509 kernel: iommu: Default domain type: Translated
Nov 29 00:35:36 np0005539509 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 00:35:36 np0005539509 kernel: SCSI subsystem initialized
Nov 29 00:35:36 np0005539509 kernel: ACPI: bus type USB registered
Nov 29 00:35:36 np0005539509 kernel: usbcore: registered new interface driver usbfs
Nov 29 00:35:36 np0005539509 kernel: usbcore: registered new interface driver hub
Nov 29 00:35:36 np0005539509 kernel: usbcore: registered new device driver usb
Nov 29 00:35:36 np0005539509 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 00:35:36 np0005539509 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 00:35:36 np0005539509 kernel: PTP clock support registered
Nov 29 00:35:36 np0005539509 kernel: EDAC MC: Ver: 3.0.0
Nov 29 00:35:36 np0005539509 kernel: NetLabel: Initializing
Nov 29 00:35:36 np0005539509 kernel: NetLabel:  domain hash size = 128
Nov 29 00:35:36 np0005539509 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 00:35:36 np0005539509 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 00:35:36 np0005539509 kernel: PCI: Using ACPI for IRQ routing
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 00:35:36 np0005539509 kernel: vgaarb: loaded
Nov 29 00:35:36 np0005539509 kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 00:35:36 np0005539509 kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 00:35:36 np0005539509 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 00:35:36 np0005539509 kernel: pnp: PnP ACPI init
Nov 29 00:35:36 np0005539509 kernel: pnp: PnP ACPI: found 5 devices
Nov 29 00:35:36 np0005539509 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_INET protocol family
Nov 29 00:35:36 np0005539509 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 00:35:36 np0005539509 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_XDP protocol family
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:36 np0005539509 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 00:35:36 np0005539509 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 00:35:36 np0005539509 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 72424 usecs
Nov 29 00:35:36 np0005539509 kernel: PCI: CLS 0 bytes, default 64
Nov 29 00:35:36 np0005539509 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 00:35:36 np0005539509 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 00:35:36 np0005539509 kernel: ACPI: bus type thunderbolt registered
Nov 29 00:35:36 np0005539509 kernel: Trying to unpack rootfs image as initramfs...
Nov 29 00:35:36 np0005539509 kernel: Initialise system trusted keyrings
Nov 29 00:35:36 np0005539509 kernel: Key type blacklist registered
Nov 29 00:35:36 np0005539509 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 00:35:36 np0005539509 kernel: zbud: loaded
Nov 29 00:35:36 np0005539509 kernel: integrity: Platform Keyring initialized
Nov 29 00:35:36 np0005539509 kernel: integrity: Machine keyring initialized
Nov 29 00:35:36 np0005539509 kernel: Freeing initrd memory: 85868K
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_ALG protocol family
Nov 29 00:35:36 np0005539509 kernel: xor: automatically using best checksumming function   avx       
Nov 29 00:35:36 np0005539509 kernel: Key type asymmetric registered
Nov 29 00:35:36 np0005539509 kernel: Asymmetric key parser 'x509' registered
Nov 29 00:35:36 np0005539509 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 00:35:36 np0005539509 kernel: io scheduler mq-deadline registered
Nov 29 00:35:36 np0005539509 kernel: io scheduler kyber registered
Nov 29 00:35:36 np0005539509 kernel: io scheduler bfq registered
Nov 29 00:35:36 np0005539509 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 00:35:36 np0005539509 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 00:35:36 np0005539509 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 00:35:36 np0005539509 kernel: ACPI: button: Power Button [PWRF]
Nov 29 00:35:36 np0005539509 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 00:35:36 np0005539509 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 00:35:36 np0005539509 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 00:35:36 np0005539509 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 00:35:36 np0005539509 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 00:35:36 np0005539509 kernel: Non-volatile memory driver v1.3
Nov 29 00:35:36 np0005539509 kernel: rdac: device handler registered
Nov 29 00:35:36 np0005539509 kernel: hp_sw: device handler registered
Nov 29 00:35:36 np0005539509 kernel: emc: device handler registered
Nov 29 00:35:36 np0005539509 kernel: alua: device handler registered
Nov 29 00:35:36 np0005539509 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 00:35:36 np0005539509 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 00:35:36 np0005539509 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 00:35:36 np0005539509 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 00:35:36 np0005539509 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 00:35:36 np0005539509 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 00:35:36 np0005539509 kernel: usb usb1: Product: UHCI Host Controller
Nov 29 00:35:36 np0005539509 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 00:35:36 np0005539509 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 00:35:36 np0005539509 kernel: hub 1-0:1.0: USB hub found
Nov 29 00:35:36 np0005539509 kernel: hub 1-0:1.0: 2 ports detected
Nov 29 00:35:36 np0005539509 kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 00:35:36 np0005539509 kernel: usbserial: USB Serial support registered for generic
Nov 29 00:35:36 np0005539509 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 00:35:36 np0005539509 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 00:35:36 np0005539509 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 00:35:36 np0005539509 kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 00:35:36 np0005539509 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 00:35:36 np0005539509 kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 00:35:36 np0005539509 kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:35 UTC (1764394535)
Nov 29 00:35:36 np0005539509 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 00:35:36 np0005539509 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 00:35:36 np0005539509 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 00:35:36 np0005539509 kernel: usbcore: registered new interface driver usbhid
Nov 29 00:35:36 np0005539509 kernel: usbhid: USB HID core driver
Nov 29 00:35:36 np0005539509 kernel: drop_monitor: Initializing network drop monitor service
Nov 29 00:35:36 np0005539509 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 00:35:36 np0005539509 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 00:35:36 np0005539509 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 00:35:36 np0005539509 kernel: Initializing XFRM netlink socket
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_INET6 protocol family
Nov 29 00:35:36 np0005539509 kernel: Segment Routing with IPv6
Nov 29 00:35:36 np0005539509 kernel: NET: Registered PF_PACKET protocol family
Nov 29 00:35:36 np0005539509 kernel: mpls_gso: MPLS GSO support
Nov 29 00:35:36 np0005539509 kernel: IPI shorthand broadcast: enabled
Nov 29 00:35:36 np0005539509 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 00:35:36 np0005539509 kernel: AES CTR mode by8 optimization enabled
Nov 29 00:35:36 np0005539509 kernel: sched_clock: Marking stable (1193005233, 147104155)->(1454713589, -114604201)
Nov 29 00:35:36 np0005539509 kernel: registered taskstats version 1
Nov 29 00:35:36 np0005539509 kernel: Loading compiled-in X.509 certificates
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 00:35:36 np0005539509 kernel: Demotion targets for Node 0: null
Nov 29 00:35:36 np0005539509 kernel: page_owner is disabled
Nov 29 00:35:36 np0005539509 kernel: Key type .fscrypt registered
Nov 29 00:35:36 np0005539509 kernel: Key type fscrypt-provisioning registered
Nov 29 00:35:36 np0005539509 kernel: Key type big_key registered
Nov 29 00:35:36 np0005539509 kernel: Key type encrypted registered
Nov 29 00:35:36 np0005539509 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 00:35:36 np0005539509 kernel: Loading compiled-in module X.509 certificates
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:36 np0005539509 kernel: ima: Allocated hash algorithm: sha256
Nov 29 00:35:36 np0005539509 kernel: ima: No architecture policies found
Nov 29 00:35:36 np0005539509 kernel: evm: Initialising EVM extended attributes:
Nov 29 00:35:36 np0005539509 kernel: evm: security.selinux
Nov 29 00:35:36 np0005539509 kernel: evm: security.SMACK64 (disabled)
Nov 29 00:35:36 np0005539509 kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 00:35:36 np0005539509 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 00:35:36 np0005539509 kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 00:35:36 np0005539509 kernel: evm: security.apparmor (disabled)
Nov 29 00:35:36 np0005539509 kernel: evm: security.ima
Nov 29 00:35:36 np0005539509 kernel: evm: security.capability
Nov 29 00:35:36 np0005539509 kernel: evm: HMAC attrs: 0x1
Nov 29 00:35:36 np0005539509 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 00:35:36 np0005539509 kernel: Running certificate verification RSA selftest
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 00:35:36 np0005539509 kernel: Running certificate verification ECDSA selftest
Nov 29 00:35:36 np0005539509 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 00:35:36 np0005539509 kernel: clk: Disabling unused clocks
Nov 29 00:35:36 np0005539509 kernel: Freeing unused decrypted memory: 2028K
Nov 29 00:35:36 np0005539509 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 00:35:36 np0005539509 kernel: Write protecting the kernel read-only data: 30720k
Nov 29 00:35:36 np0005539509 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 00:35:36 np0005539509 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 00:35:36 np0005539509 kernel: Run /init as init process
Nov 29 00:35:36 np0005539509 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:36 np0005539509 systemd: Detected virtualization kvm.
Nov 29 00:35:36 np0005539509 systemd: Detected architecture x86-64.
Nov 29 00:35:36 np0005539509 systemd: Running in initrd.
Nov 29 00:35:36 np0005539509 systemd: No hostname configured, using default hostname.
Nov 29 00:35:36 np0005539509 systemd: Hostname set to <localhost>.
Nov 29 00:35:36 np0005539509 systemd: Initializing machine ID from VM UUID.
Nov 29 00:35:36 np0005539509 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 00:35:36 np0005539509 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 00:35:36 np0005539509 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 00:35:36 np0005539509 kernel: usb 1-1: Manufacturer: QEMU
Nov 29 00:35:36 np0005539509 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 00:35:36 np0005539509 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 00:35:36 np0005539509 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 00:35:36 np0005539509 systemd: Queued start job for default target Initrd Default Target.
Nov 29 00:35:36 np0005539509 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:36 np0005539509 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:36 np0005539509 systemd: Reached target Initrd /usr File System.
Nov 29 00:35:36 np0005539509 systemd: Reached target Local File Systems.
Nov 29 00:35:36 np0005539509 systemd: Reached target Path Units.
Nov 29 00:35:36 np0005539509 systemd: Reached target Slice Units.
Nov 29 00:35:36 np0005539509 systemd: Reached target Swaps.
Nov 29 00:35:36 np0005539509 systemd: Reached target Timer Units.
Nov 29 00:35:36 np0005539509 systemd: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:36 np0005539509 systemd: Listening on Journal Socket (/dev/log).
Nov 29 00:35:36 np0005539509 systemd: Listening on Journal Socket.
Nov 29 00:35:36 np0005539509 systemd: Listening on udev Control Socket.
Nov 29 00:35:36 np0005539509 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:36 np0005539509 systemd: Reached target Socket Units.
Nov 29 00:35:36 np0005539509 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:36 np0005539509 systemd: Starting Journal Service...
Nov 29 00:35:36 np0005539509 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:36 np0005539509 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:36 np0005539509 systemd: Starting Create System Users...
Nov 29 00:35:36 np0005539509 systemd: Starting Setup Virtual Console...
Nov 29 00:35:36 np0005539509 systemd: Finished Create List of Static Device Nodes.
Nov 29 00:35:36 np0005539509 systemd: Finished Apply Kernel Variables.
Nov 29 00:35:36 np0005539509 systemd: Finished Create System Users.
Nov 29 00:35:36 np0005539509 systemd: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:36 np0005539509 systemd-journald[303]: Journal started
Nov 29 00:35:36 np0005539509 systemd-journald[303]: Runtime Journal (/run/log/journal/6289b14c9d0e4084a8992566f6eb59ac) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:36 np0005539509 systemd-sysusers[307]: Creating group 'users' with GID 100.
Nov 29 00:35:36 np0005539509 systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Nov 29 00:35:36 np0005539509 systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 00:35:36 np0005539509 systemd: Started Journal Service.
Nov 29 00:35:36 np0005539509 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:36 np0005539509 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:36 np0005539509 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:36 np0005539509 systemd[1]: Finished Setup Virtual Console.
Nov 29 00:35:36 np0005539509 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 00:35:36 np0005539509 systemd[1]: Starting dracut cmdline hook...
Nov 29 00:35:36 np0005539509 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 00:35:36 np0005539509 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:36 np0005539509 systemd[1]: Finished dracut cmdline hook.
Nov 29 00:35:36 np0005539509 systemd[1]: Starting dracut pre-udev hook...
Nov 29 00:35:36 np0005539509 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 00:35:36 np0005539509 kernel: device-mapper: uevent: version 1.0.3
Nov 29 00:35:36 np0005539509 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 00:35:36 np0005539509 kernel: RPC: Registered named UNIX socket transport module.
Nov 29 00:35:36 np0005539509 kernel: RPC: Registered udp transport module.
Nov 29 00:35:36 np0005539509 kernel: RPC: Registered tcp transport module.
Nov 29 00:35:36 np0005539509 kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 00:35:36 np0005539509 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 00:35:36 np0005539509 rpc.statd[442]: Version 2.5.4 starting
Nov 29 00:35:36 np0005539509 rpc.statd[442]: Initializing NSM state
Nov 29 00:35:36 np0005539509 rpc.idmapd[447]: Setting log level to 0
Nov 29 00:35:36 np0005539509 systemd[1]: Finished dracut pre-udev hook.
Nov 29 00:35:36 np0005539509 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:36 np0005539509 systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:37 np0005539509 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:37 np0005539509 systemd[1]: Starting dracut pre-trigger hook...
Nov 29 00:35:37 np0005539509 systemd[1]: Finished dracut pre-trigger hook.
Nov 29 00:35:37 np0005539509 systemd[1]: Starting Coldplug All udev Devices...
Nov 29 00:35:37 np0005539509 systemd[1]: Created slice Slice /system/modprobe.
Nov 29 00:35:37 np0005539509 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:37 np0005539509 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:37 np0005539509 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:37 np0005539509 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:37 np0005539509 systemd[1]: Mounting Kernel Configuration File System...
Nov 29 00:35:37 np0005539509 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target Network.
Nov 29 00:35:37 np0005539509 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:37 np0005539509 systemd[1]: Starting dracut initqueue hook...
Nov 29 00:35:37 np0005539509 systemd[1]: Mounted Kernel Configuration File System.
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target System Initialization.
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target Basic System.
Nov 29 00:35:37 np0005539509 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 00:35:37 np0005539509 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 00:35:37 np0005539509 kernel: vda: vda1
Nov 29 00:35:37 np0005539509 systemd-udevd[480]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:37 np0005539509 kernel: scsi host0: ata_piix
Nov 29 00:35:37 np0005539509 kernel: scsi host1: ata_piix
Nov 29 00:35:37 np0005539509 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 00:35:37 np0005539509 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 00:35:37 np0005539509 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target Initrd Root Device.
Nov 29 00:35:37 np0005539509 kernel: ata1: found unknown device (class 0)
Nov 29 00:35:37 np0005539509 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 00:35:37 np0005539509 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 00:35:37 np0005539509 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 00:35:37 np0005539509 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 00:35:37 np0005539509 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 00:35:37 np0005539509 systemd[1]: Finished dracut initqueue hook.
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 00:35:37 np0005539509 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:37 np0005539509 systemd[1]: Starting dracut pre-mount hook...
Nov 29 00:35:37 np0005539509 systemd[1]: Finished dracut pre-mount hook.
Nov 29 00:35:37 np0005539509 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 00:35:37 np0005539509 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 00:35:37 np0005539509 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:37 np0005539509 systemd[1]: Mounting /sysroot...
Nov 29 00:35:38 np0005539509 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 00:35:38 np0005539509 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 00:35:38 np0005539509 kernel: XFS (vda1): Ending clean mount
Nov 29 00:35:38 np0005539509 systemd[1]: Mounted /sysroot.
Nov 29 00:35:38 np0005539509 systemd[1]: Reached target Initrd Root File System.
Nov 29 00:35:38 np0005539509 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 00:35:38 np0005539509 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 00:35:38 np0005539509 systemd[1]: Reached target Initrd File Systems.
Nov 29 00:35:38 np0005539509 systemd[1]: Reached target Initrd Default Target.
Nov 29 00:35:38 np0005539509 systemd[1]: Starting dracut mount hook...
Nov 29 00:35:38 np0005539509 systemd[1]: Finished dracut mount hook.
Nov 29 00:35:38 np0005539509 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 00:35:38 np0005539509 rpc.idmapd[447]: exiting on signal 15
Nov 29 00:35:38 np0005539509 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 00:35:38 np0005539509 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Network.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Timer Units.
Nov 29 00:35:38 np0005539509 systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Initrd Default Target.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Basic System.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Initrd Root Device.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Initrd /usr File System.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Path Units.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Remote File Systems.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Slice Units.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Socket Units.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target System Initialization.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Local File Systems.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Swaps.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut mount hook.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut pre-mount hook.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut initqueue hook.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Setup Virtual Console.
Nov 29 00:35:38 np0005539509 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 00:35:38 np0005539509 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Closed udev Control Socket.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Closed udev Kernel Socket.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut pre-udev hook.
Nov 29 00:35:38 np0005539509 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped dracut cmdline hook.
Nov 29 00:35:38 np0005539509 systemd[1]: Starting Cleanup udev Database...
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 00:35:38 np0005539509 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 00:35:38 np0005539509 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Stopped Create System Users.
Nov 29 00:35:38 np0005539509 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 00:35:38 np0005539509 systemd[1]: Finished Cleanup udev Database.
Nov 29 00:35:38 np0005539509 systemd[1]: Reached target Switch Root.
Nov 29 00:35:38 np0005539509 systemd[1]: Starting Switch Root...
Nov 29 00:35:38 np0005539509 systemd[1]: Switching root.
Nov 29 00:35:38 np0005539509 systemd-journald[303]: Journal stopped
Nov 29 00:35:39 np0005539509 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 29 00:35:39 np0005539509 kernel: audit: type=1404 audit(1764394539.051:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:35:39 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:35:39 np0005539509 kernel: audit: type=1403 audit(1764394539.178:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 00:35:39 np0005539509 systemd: Successfully loaded SELinux policy in 129.644ms.
Nov 29 00:35:39 np0005539509 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.293ms.
Nov 29 00:35:39 np0005539509 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:39 np0005539509 systemd: Detected virtualization kvm.
Nov 29 00:35:39 np0005539509 systemd: Detected architecture x86-64.
Nov 29 00:35:39 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:35:39 np0005539509 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd: Stopped Switch Root.
Nov 29 00:35:39 np0005539509 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 00:35:39 np0005539509 systemd: Created slice Slice /system/getty.
Nov 29 00:35:39 np0005539509 systemd: Created slice Slice /system/serial-getty.
Nov 29 00:35:39 np0005539509 systemd: Created slice Slice /system/sshd-keygen.
Nov 29 00:35:39 np0005539509 systemd: Created slice User and Session Slice.
Nov 29 00:35:39 np0005539509 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:39 np0005539509 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 29 00:35:39 np0005539509 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 00:35:39 np0005539509 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:39 np0005539509 systemd: Stopped target Switch Root.
Nov 29 00:35:39 np0005539509 systemd: Stopped target Initrd File Systems.
Nov 29 00:35:39 np0005539509 systemd: Stopped target Initrd Root File System.
Nov 29 00:35:39 np0005539509 systemd: Reached target Local Integrity Protected Volumes.
Nov 29 00:35:39 np0005539509 systemd: Reached target Path Units.
Nov 29 00:35:39 np0005539509 systemd: Reached target rpc_pipefs.target.
Nov 29 00:35:39 np0005539509 systemd: Reached target Slice Units.
Nov 29 00:35:39 np0005539509 systemd: Reached target Swaps.
Nov 29 00:35:39 np0005539509 systemd: Reached target Local Verity Protected Volumes.
Nov 29 00:35:39 np0005539509 systemd: Listening on RPCbind Server Activation Socket.
Nov 29 00:35:39 np0005539509 systemd: Reached target RPC Port Mapper.
Nov 29 00:35:39 np0005539509 systemd: Listening on Process Core Dump Socket.
Nov 29 00:35:39 np0005539509 systemd: Listening on initctl Compatibility Named Pipe.
Nov 29 00:35:39 np0005539509 systemd: Listening on udev Control Socket.
Nov 29 00:35:39 np0005539509 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:39 np0005539509 systemd: Mounting Huge Pages File System...
Nov 29 00:35:39 np0005539509 systemd: Mounting POSIX Message Queue File System...
Nov 29 00:35:39 np0005539509 systemd: Mounting Kernel Debug File System...
Nov 29 00:35:39 np0005539509 systemd: Mounting Kernel Trace File System...
Nov 29 00:35:39 np0005539509 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:39 np0005539509 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:39 np0005539509 systemd: Starting Load Kernel Module configfs...
Nov 29 00:35:39 np0005539509 systemd: Starting Load Kernel Module drm...
Nov 29 00:35:39 np0005539509 systemd: Starting Load Kernel Module efi_pstore...
Nov 29 00:35:39 np0005539509 systemd: Starting Load Kernel Module fuse...
Nov 29 00:35:39 np0005539509 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 00:35:39 np0005539509 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd: Stopped File System Check on Root Device.
Nov 29 00:35:39 np0005539509 systemd: Stopped Journal Service.
Nov 29 00:35:39 np0005539509 systemd: Starting Journal Service...
Nov 29 00:35:39 np0005539509 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:39 np0005539509 systemd: Starting Generate network units from Kernel command line...
Nov 29 00:35:39 np0005539509 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:39 np0005539509 systemd-journald[677]: Journal started
Nov 29 00:35:39 np0005539509 systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:39 np0005539509 systemd: Starting Remount Root and Kernel File Systems...
Nov 29 00:35:39 np0005539509 systemd[1]: Queued start job for default target Multi-User System.
Nov 29 00:35:39 np0005539509 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 00:35:39 np0005539509 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:39 np0005539509 systemd: Starting Coldplug All udev Devices...
Nov 29 00:35:39 np0005539509 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 00:35:39 np0005539509 systemd: Started Journal Service.
Nov 29 00:35:39 np0005539509 kernel: ACPI: bus type drm_connector registered
Nov 29 00:35:39 np0005539509 kernel: fuse: init (API version 7.37)
Nov 29 00:35:39 np0005539509 systemd[1]: Mounted Huge Pages File System.
Nov 29 00:35:39 np0005539509 systemd[1]: Mounted POSIX Message Queue File System.
Nov 29 00:35:39 np0005539509 systemd[1]: Mounted Kernel Debug File System.
Nov 29 00:35:39 np0005539509 systemd[1]: Mounted Kernel Trace File System.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 00:35:39 np0005539509 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:39 np0005539509 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Load Kernel Module drm.
Nov 29 00:35:39 np0005539509 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 00:35:39 np0005539509 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Load Kernel Module fuse.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Apply Kernel Variables.
Nov 29 00:35:39 np0005539509 systemd[1]: Mounting FUSE Control File System...
Nov 29 00:35:39 np0005539509 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Rebuild Hardware Database...
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 00:35:39 np0005539509 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Create System Users...
Nov 29 00:35:39 np0005539509 systemd[1]: Mounted FUSE Control File System.
Nov 29 00:35:39 np0005539509 systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:39 np0005539509 systemd-journald[677]: Received client request to flush runtime journal.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Create System Users.
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:39 np0005539509 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:39 np0005539509 systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 00:35:39 np0005539509 systemd[1]: Reached target Local File Systems.
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 00:35:39 np0005539509 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 00:35:39 np0005539509 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 00:35:39 np0005539509 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:40 np0005539509 bootctl[694]: Couldn't find EFI system partition, skipping.
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 00:35:40 np0005539509 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:40 np0005539509 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 00:35:40 np0005539509 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Security Auditing Service...
Nov 29 00:35:40 np0005539509 systemd[1]: Starting RPC Bind...
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 00:35:40 np0005539509 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 00:35:40 np0005539509 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 00:35:40 np0005539509 systemd[1]: Started RPC Bind.
Nov 29 00:35:40 np0005539509 augenrules[706]: /sbin/augenrules: No change
Nov 29 00:35:40 np0005539509 augenrules[721]: No rules
Nov 29 00:35:40 np0005539509 augenrules[721]: enabled 1
Nov 29 00:35:40 np0005539509 augenrules[721]: failure 1
Nov 29 00:35:40 np0005539509 augenrules[721]: pid 701
Nov 29 00:35:40 np0005539509 augenrules[721]: rate_limit 0
Nov 29 00:35:40 np0005539509 augenrules[721]: backlog_limit 8192
Nov 29 00:35:40 np0005539509 augenrules[721]: lost 0
Nov 29 00:35:40 np0005539509 augenrules[721]: backlog 4
Nov 29 00:35:40 np0005539509 augenrules[721]: backlog_wait_time 60000
Nov 29 00:35:40 np0005539509 augenrules[721]: backlog_wait_time_actual 0
Nov 29 00:35:40 np0005539509 systemd[1]: Started Security Auditing Service.
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Rebuild Hardware Database.
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Update is Completed...
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Update is Completed.
Nov 29 00:35:40 np0005539509 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:40 np0005539509 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:40 np0005539509 systemd[1]: Reached target System Initialization.
Nov 29 00:35:40 np0005539509 systemd[1]: Started dnf makecache --timer.
Nov 29 00:35:40 np0005539509 systemd[1]: Started Daily rotation of log files.
Nov 29 00:35:40 np0005539509 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 00:35:40 np0005539509 systemd[1]: Reached target Timer Units.
Nov 29 00:35:40 np0005539509 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:40 np0005539509 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 00:35:40 np0005539509 systemd[1]: Reached target Socket Units.
Nov 29 00:35:40 np0005539509 systemd-udevd[737]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:40 np0005539509 systemd[1]: Starting D-Bus System Message Bus...
Nov 29 00:35:40 np0005539509 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:40 np0005539509 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:40 np0005539509 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:40 np0005539509 systemd[1]: Started D-Bus System Message Bus.
Nov 29 00:35:40 np0005539509 systemd[1]: Reached target Basic System.
Nov 29 00:35:40 np0005539509 dbus-broker-lau[768]: Ready
Nov 29 00:35:40 np0005539509 systemd[1]: Starting NTP client/server...
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 00:35:40 np0005539509 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 00:35:40 np0005539509 systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 00:35:40 np0005539509 systemd[1]: Started irqbalance daemon.
Nov 29 00:35:40 np0005539509 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 00:35:40 np0005539509 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 00:35:40 np0005539509 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 00:35:40 np0005539509 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 00:35:40 np0005539509 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 00:35:40 np0005539509 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:40 np0005539509 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:40 np0005539509 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:40 np0005539509 systemd[1]: Reached target sshd-keygen.target.
Nov 29 00:35:40 np0005539509 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 00:35:40 np0005539509 systemd[1]: Reached target User and Group Name Lookups.
Nov 29 00:35:40 np0005539509 systemd[1]: Starting User Login Management...
Nov 29 00:35:40 np0005539509 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 00:35:40 np0005539509 chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 00:35:40 np0005539509 chronyd[792]: Loaded 0 symmetric keys
Nov 29 00:35:40 np0005539509 chronyd[792]: Using right/UTC timezone to obtain leap second data
Nov 29 00:35:40 np0005539509 chronyd[792]: Loaded seccomp filter (level 2)
Nov 29 00:35:40 np0005539509 systemd[1]: Started NTP client/server.
Nov 29 00:35:40 np0005539509 systemd-logind[785]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 00:35:40 np0005539509 systemd-logind[785]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 00:35:40 np0005539509 systemd-logind[785]: New seat seat0.
Nov 29 00:35:40 np0005539509 systemd[1]: Started User Login Management.
Nov 29 00:35:40 np0005539509 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 00:35:41 np0005539509 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 00:35:41 np0005539509 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 00:35:41 np0005539509 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 00:35:41 np0005539509 kernel: Console: switching to colour dummy device 80x25
Nov 29 00:35:41 np0005539509 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 00:35:41 np0005539509 kernel: [drm] features: -context_init
Nov 29 00:35:41 np0005539509 kernel: [drm] number of scanouts: 1
Nov 29 00:35:41 np0005539509 kernel: [drm] number of cap sets: 0
Nov 29 00:35:41 np0005539509 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 00:35:41 np0005539509 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 00:35:41 np0005539509 kernel: Console: switching to colour frame buffer device 128x48
Nov 29 00:35:41 np0005539509 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 00:35:41 np0005539509 kernel: kvm_amd: TSC scaling supported
Nov 29 00:35:41 np0005539509 kernel: kvm_amd: Nested Virtualization enabled
Nov 29 00:35:41 np0005539509 kernel: kvm_amd: Nested Paging enabled
Nov 29 00:35:41 np0005539509 kernel: kvm_amd: LBR virtualization supported
Nov 29 00:35:41 np0005539509 iptables.init[777]: iptables: Applying firewall rules: [  OK  ]
Nov 29 00:35:41 np0005539509 systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 00:35:41 np0005539509 cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:41 +0000. Up 6.91 seconds.
Nov 29 00:35:41 np0005539509 systemd[1]: run-cloud\x2dinit-tmp-tmpfly6n2hr.mount: Deactivated successfully.
Nov 29 00:35:41 np0005539509 systemd[1]: Starting Hostname Service...
Nov 29 00:35:41 np0005539509 systemd[1]: Started Hostname Service.
Nov 29 00:35:41 np0005539509 systemd-hostnamed[852]: Hostname set to <np0005539509.novalocal> (static)
Nov 29 00:35:41 np0005539509 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 00:35:41 np0005539509 systemd[1]: Reached target Preparation for Network.
Nov 29 00:35:41 np0005539509 systemd[1]: Starting Network Manager...
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7796] NetworkManager (version 1.54.1-1.el9) is starting... (boot:ba0dbca2-f496-4536-953e-379bfe5fc9e9)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7801] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7873] manager[0x55d11b224080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7903] hostname: hostname: using hostnamed
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7904] hostname: static hostname changed from (none) to "np0005539509.novalocal"
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7907] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7990] manager[0x55d11b224080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.7991] manager[0x55d11b224080]: rfkill: WWAN hardware radio set enabled
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8024] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8025] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8025] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8026] manager: Networking is enabled by state file
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8027] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8034] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8050] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8060] dhcp: init: Using DHCP client 'internal'
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8062] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8072] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:41 np0005539509 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8078] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8084] device (lo): Activation: starting connection 'lo' (e88f289b-57af-451c-b662-f7b3e0248e91)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8090] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8092] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8117] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8122] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8124] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8125] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8128] device (eth0): carrier: link connected
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8130] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8136] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8141] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8145] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8146] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8148] manager: NetworkManager state is now CONNECTING
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8149] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8155] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8159] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:35:41 np0005539509 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:35:41 np0005539509 systemd[1]: Started Network Manager.
Nov 29 00:35:41 np0005539509 systemd[1]: Reached target Network.
Nov 29 00:35:41 np0005539509 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8403] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8406] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:35:41 np0005539509 NetworkManager[856]: <info>  [1764394541.8413] device (lo): Activation: successful, device activated.
Nov 29 00:35:41 np0005539509 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 00:35:41 np0005539509 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:35:41 np0005539509 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 00:35:41 np0005539509 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:41 np0005539509 systemd[1]: Reached target NFS client services.
Nov 29 00:35:41 np0005539509 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:41 np0005539509 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:41 np0005539509 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9564] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9578] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9604] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9642] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9643] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9646] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9650] device (eth0): Activation: successful, device activated.
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9655] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:35:42 np0005539509 NetworkManager[856]: <info>  [1764394542.9658] manager: startup complete
Nov 29 00:35:42 np0005539509 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:35:42 np0005539509 systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 00:35:43 np0005539509 cloud-init[919]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:35:43 +0000. Up 8.93 seconds.
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |  eth0  | True |        38.102.83.204         | 255.255.255.0 | global | fa:16:3e:f4:b0:ce |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fef4:b0ce/64 |       .       |  link  | fa:16:3e:f4:b0:ce |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 00:35:43 np0005539509 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:47 np0005539509 chronyd[792]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 00:35:49 np0005539509 chronyd[792]: System clock wrong by 1.179441 seconds
Nov 29 00:35:49 np0005539509 chronyd[792]: System clock was stepped by 1.179441 seconds
Nov 29 00:35:49 np0005539509 chronyd[792]: System clock TAI offset set to 37 seconds
Nov 29 00:35:52 np0005539509 irqbalance[781]: Cannot change IRQ 35 affinity: Operation not permitted
Nov 29 00:35:52 np0005539509 irqbalance[781]: IRQ 35 affinity is now unmanaged
Nov 29 00:35:52 np0005539509 irqbalance[781]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 29 00:35:52 np0005539509 irqbalance[781]: IRQ 25 affinity is now unmanaged
Nov 29 00:35:52 np0005539509 irqbalance[781]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 00:35:52 np0005539509 irqbalance[781]: IRQ 31 affinity is now unmanaged
Nov 29 00:35:52 np0005539509 irqbalance[781]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 29 00:35:52 np0005539509 irqbalance[781]: IRQ 26 affinity is now unmanaged
Nov 29 00:35:52 np0005539509 irqbalance[781]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 00:35:52 np0005539509 irqbalance[781]: IRQ 30 affinity is now unmanaged
Nov 29 00:35:52 np0005539509 irqbalance[781]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 00:35:52 np0005539509 irqbalance[781]: IRQ 29 affinity is now unmanaged
Nov 29 00:35:54 np0005539509 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:36:13 np0005539509 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:37:24 np0005539509 cloud-init[919]: Generating public/private rsa key pair.
Nov 29 00:37:24 np0005539509 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 00:37:24 np0005539509 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 00:37:24 np0005539509 cloud-init[919]: The key fingerprint is:
Nov 29 00:37:24 np0005539509 cloud-init[919]: SHA256:zHsTTn9kXEJ/sx6L0lH9r9QVsWcWY+VXHpZ+XptPw2E root@np0005539509.novalocal
Nov 29 00:37:24 np0005539509 cloud-init[919]: The key's randomart image is:
Nov 29 00:37:24 np0005539509 cloud-init[919]: +---[RSA 3072]----+
Nov 29 00:37:24 np0005539509 cloud-init[919]: |              .B=|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |             .o=O|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |              o=%|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |       o     ..E@|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |        S o  .*+O|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |         + o.o+*B|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |        . +..oo+=|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |         . ..o ..|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |              .  |
Nov 29 00:37:24 np0005539509 cloud-init[919]: +----[SHA256]-----+
Nov 29 00:37:24 np0005539509 cloud-init[919]: Generating public/private ecdsa key pair.
Nov 29 00:37:24 np0005539509 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 00:37:24 np0005539509 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 00:37:24 np0005539509 cloud-init[919]: The key fingerprint is:
Nov 29 00:37:24 np0005539509 cloud-init[919]: SHA256:hrB3mjrdZLTcMKbz+gfWCLe38DVhYocwMJUD7MjsnkY root@np0005539509.novalocal
Nov 29 00:37:24 np0005539509 cloud-init[919]: The key's randomart image is:
Nov 29 00:37:24 np0005539509 cloud-init[919]: +---[ECDSA 256]---+
Nov 29 00:37:24 np0005539509 cloud-init[919]: |     .++..       |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |      ..=        |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |   o.o   + .     |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |    +oo.* + +    |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |   .. oBSX + .   |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |    E.o=@ + o    |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |   o ooB = o .   |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |    =.. o +      |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |   ....o..       |
Nov 29 00:37:24 np0005539509 cloud-init[919]: +----[SHA256]-----+
Nov 29 00:37:24 np0005539509 cloud-init[919]: Generating public/private ed25519 key pair.
Nov 29 00:37:24 np0005539509 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 00:37:24 np0005539509 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 00:37:24 np0005539509 cloud-init[919]: The key fingerprint is:
Nov 29 00:37:24 np0005539509 cloud-init[919]: SHA256:Trsar2Pf1VIfL+mFrnewRWxstKDdw89MSvBE4PGb0GE root@np0005539509.novalocal
Nov 29 00:37:24 np0005539509 cloud-init[919]: The key's randomart image is:
Nov 29 00:37:24 np0005539509 cloud-init[919]: +--[ED25519 256]--+
Nov 29 00:37:24 np0005539509 cloud-init[919]: |            ooE  |
Nov 29 00:37:24 np0005539509 cloud-init[919]: |           ..*...|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |            =+*+.|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |           . ooBB|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |        S    .+Xo|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |       o .   oo+B|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |      . o   o =++|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |      oo o . +oo.|
Nov 29 00:37:24 np0005539509 cloud-init[919]: |     .o=+ . .oo. |
Nov 29 00:37:24 np0005539509 cloud-init[919]: +----[SHA256]-----+
Nov 29 00:37:24 np0005539509 systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 00:37:24 np0005539509 systemd[1]: Reached target Cloud-config availability.
Nov 29 00:37:24 np0005539509 systemd[1]: Reached target Network is Online.
Nov 29 00:37:24 np0005539509 systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 00:37:24 np0005539509 systemd[1]: Starting Crash recovery kernel arming...
Nov 29 00:37:24 np0005539509 systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 00:37:24 np0005539509 systemd[1]: Starting System Logging Service...
Nov 29 00:37:24 np0005539509 systemd[1]: Starting OpenSSH server daemon...
Nov 29 00:37:24 np0005539509 sm-notify[1005]: Version 2.5.4 starting
Nov 29 00:37:24 np0005539509 systemd[1]: Starting Permit User Sessions...
Nov 29 00:37:24 np0005539509 systemd[1]: Started Notify NFS peers of a restart.
Nov 29 00:37:24 np0005539509 systemd[1]: Started OpenSSH server daemon.
Nov 29 00:37:24 np0005539509 systemd[1]: Finished Permit User Sessions.
Nov 29 00:37:24 np0005539509 systemd[1]: Started Command Scheduler.
Nov 29 00:37:24 np0005539509 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Nov 29 00:37:24 np0005539509 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 00:37:24 np0005539509 systemd[1]: Started Getty on tty1.
Nov 29 00:37:24 np0005539509 systemd[1]: Started Serial Getty on ttyS0.
Nov 29 00:37:24 np0005539509 systemd[1]: Reached target Login Prompts.
Nov 29 00:37:24 np0005539509 systemd[1]: Started System Logging Service.
Nov 29 00:37:24 np0005539509 systemd[1]: Reached target Multi-User System.
Nov 29 00:37:24 np0005539509 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 00:37:24 np0005539509 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 00:37:24 np0005539509 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 00:37:24 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 00:37:24 np0005539509 kdumpctl[1022]: kdump: No kdump initial ramdisk found.
Nov 29 00:37:24 np0005539509 kdumpctl[1022]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 00:37:25 np0005539509 cloud-init[1119]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:37:25 +0000. Up 109.49 seconds.
Nov 29 00:37:25 np0005539509 systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 00:37:25 np0005539509 systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 00:37:25 np0005539509 dracut[1286]: dracut-057-102.git20250818.el9
Nov 29 00:37:25 np0005539509 cloud-init[1302]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:37:25 +0000. Up 109.91 seconds.
Nov 29 00:37:25 np0005539509 cloud-init[1304]: #############################################################
Nov 29 00:37:25 np0005539509 cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 00:37:25 np0005539509 cloud-init[1307]: 256 SHA256:hrB3mjrdZLTcMKbz+gfWCLe38DVhYocwMJUD7MjsnkY root@np0005539509.novalocal (ECDSA)
Nov 29 00:37:25 np0005539509 cloud-init[1309]: 256 SHA256:Trsar2Pf1VIfL+mFrnewRWxstKDdw89MSvBE4PGb0GE root@np0005539509.novalocal (ED25519)
Nov 29 00:37:25 np0005539509 cloud-init[1311]: 3072 SHA256:zHsTTn9kXEJ/sx6L0lH9r9QVsWcWY+VXHpZ+XptPw2E root@np0005539509.novalocal (RSA)
Nov 29 00:37:25 np0005539509 cloud-init[1312]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 00:37:25 np0005539509 cloud-init[1313]: #############################################################
Nov 29 00:37:25 np0005539509 cloud-init[1302]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:37:25 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 110.10 seconds
Nov 29 00:37:25 np0005539509 dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 00:37:25 np0005539509 systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 00:37:25 np0005539509 systemd[1]: Reached target Cloud-init target.
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: memstrack is not available
Nov 29 00:37:26 np0005539509 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:37:26 np0005539509 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:37:27 np0005539509 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:37:27 np0005539509 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:37:27 np0005539509 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:37:27 np0005539509 dracut[1288]: memstrack is not available
Nov 29 00:37:27 np0005539509 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:37:27 np0005539509 dracut[1288]: *** Including module: systemd ***
Nov 29 00:37:27 np0005539509 dracut[1288]: *** Including module: fips ***
Nov 29 00:37:27 np0005539509 dracut[1288]: *** Including module: systemd-initrd ***
Nov 29 00:37:27 np0005539509 dracut[1288]: *** Including module: i18n ***
Nov 29 00:37:27 np0005539509 dracut[1288]: *** Including module: drm ***
Nov 29 00:37:28 np0005539509 dracut[1288]: *** Including module: prefixdevname ***
Nov 29 00:37:28 np0005539509 dracut[1288]: *** Including module: kernel-modules ***
Nov 29 00:37:28 np0005539509 kernel: block vda: the capability attribute has been deprecated.
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: kernel-modules-extra ***
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: qemu ***
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: fstab-sys ***
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: rootfs-block ***
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: terminfo ***
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: udev-rules ***
Nov 29 00:37:29 np0005539509 dracut[1288]: Skipping udev rule: 91-permissions.rules
Nov 29 00:37:29 np0005539509 dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 00:37:29 np0005539509 dracut[1288]: *** Including module: virtiofs ***
Nov 29 00:37:30 np0005539509 dracut[1288]: *** Including module: dracut-systemd ***
Nov 29 00:37:30 np0005539509 dracut[1288]: *** Including module: usrmount ***
Nov 29 00:37:30 np0005539509 dracut[1288]: *** Including module: base ***
Nov 29 00:37:30 np0005539509 dracut[1288]: *** Including module: fs-lib ***
Nov 29 00:37:30 np0005539509 dracut[1288]: *** Including module: kdumpbase ***
Nov 29 00:37:31 np0005539509 dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 00:37:31 np0005539509 dracut[1288]:  microcode_ctl module: mangling fw_dir
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 00:37:31 np0005539509 dracut[1288]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 00:37:31 np0005539509 dracut[1288]: *** Including module: openssl ***
Nov 29 00:37:31 np0005539509 dracut[1288]: *** Including module: shutdown ***
Nov 29 00:37:31 np0005539509 dracut[1288]: *** Including module: squash ***
Nov 29 00:37:31 np0005539509 dracut[1288]: *** Including modules done ***
Nov 29 00:37:31 np0005539509 dracut[1288]: *** Installing kernel module dependencies ***
Nov 29 00:37:32 np0005539509 dracut[1288]: *** Installing kernel module dependencies done ***
Nov 29 00:37:32 np0005539509 dracut[1288]: *** Resolving executable dependencies ***
Nov 29 00:37:34 np0005539509 dracut[1288]: *** Resolving executable dependencies done ***
Nov 29 00:37:34 np0005539509 dracut[1288]: *** Generating early-microcode cpio image ***
Nov 29 00:37:34 np0005539509 dracut[1288]: *** Store current command line parameters ***
Nov 29 00:37:34 np0005539509 dracut[1288]: Stored kernel commandline:
Nov 29 00:37:34 np0005539509 dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Nov 29 00:37:34 np0005539509 dracut[1288]: *** Install squash loader ***
Nov 29 00:37:35 np0005539509 dracut[1288]: *** Squashing the files inside the initramfs ***
Nov 29 00:37:36 np0005539509 dracut[1288]: *** Squashing the files inside the initramfs done ***
Nov 29 00:37:36 np0005539509 dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 00:37:36 np0005539509 dracut[1288]: *** Hardlinking files ***
Nov 29 00:37:36 np0005539509 dracut[1288]: *** Hardlinking files done ***
Nov 29 00:37:36 np0005539509 dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 00:37:37 np0005539509 kdumpctl[1022]: kdump: kexec: loaded kdump kernel
Nov 29 00:37:37 np0005539509 kdumpctl[1022]: kdump: Starting kdump: [OK]
Nov 29 00:37:37 np0005539509 systemd[1]: Finished Crash recovery kernel arming.
Nov 29 00:37:37 np0005539509 systemd[1]: Startup finished in 1.551s (kernel) + 3.149s (initrd) + 1min 57.285s (userspace) = 2min 1.986s.
Nov 29 00:37:40 np0005539509 systemd[1]: Created slice User Slice of UID 1000.
Nov 29 00:37:40 np0005539509 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 00:37:40 np0005539509 systemd-logind[785]: New session 1 of user zuul.
Nov 29 00:37:40 np0005539509 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 00:37:40 np0005539509 systemd[1]: Starting User Manager for UID 1000...
Nov 29 00:37:41 np0005539509 systemd[4300]: Queued start job for default target Main User Target.
Nov 29 00:37:41 np0005539509 systemd[4300]: Created slice User Application Slice.
Nov 29 00:37:41 np0005539509 systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 00:37:41 np0005539509 systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 00:37:41 np0005539509 systemd[4300]: Reached target Paths.
Nov 29 00:37:41 np0005539509 systemd[4300]: Reached target Timers.
Nov 29 00:37:41 np0005539509 systemd[4300]: Starting D-Bus User Message Bus Socket...
Nov 29 00:37:41 np0005539509 systemd[4300]: Starting Create User's Volatile Files and Directories...
Nov 29 00:37:41 np0005539509 systemd[4300]: Finished Create User's Volatile Files and Directories.
Nov 29 00:37:41 np0005539509 systemd[4300]: Listening on D-Bus User Message Bus Socket.
Nov 29 00:37:41 np0005539509 systemd[4300]: Reached target Sockets.
Nov 29 00:37:41 np0005539509 systemd[4300]: Reached target Basic System.
Nov 29 00:37:41 np0005539509 systemd[4300]: Reached target Main User Target.
Nov 29 00:37:41 np0005539509 systemd[4300]: Startup finished in 186ms.
Nov 29 00:37:41 np0005539509 systemd[1]: Started User Manager for UID 1000.
Nov 29 00:37:41 np0005539509 systemd[1]: Started Session 1 of User zuul.
Nov 29 00:37:41 np0005539509 python3[4383]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:37:45 np0005539509 python3[4411]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:37:52 np0005539509 python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:37:53 np0005539509 python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 00:37:55 np0005539509 python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrxzXgpPmVv8+7+5w1Oy1RsXOPeqdxTcUlq37d0RcYulAAKXWla/qJwAX46v5xh/Mg4GnRpk77lvDWcVnOQjFYQg3OeLmFgDDNPV0YL7URmIe/MvgcqM+Kx7/SQjk+hEt7rUIqkFUjeREX60T5eTEMANFgJrljqZcBTMgYr67x4v7oFELzKuZIO0SCAprJ9NYmdRaC+DsjZjU+DuFdHBnfZCpgkTFMCda2FAS9BneAVOIMCBu5RgNVJXeAgIsPX9GNX3qDJMKOluQLOW++2gbue3S1Nrs1GMPm+IPRD4yWc9eZs1tpR1jdP1BEPBpyQRQlUn4z7BUdEogSzYiXCSmqzN1o/R3mdi16bG8e2lHve5MQFABPko8KsgVOJu0H7b7wGo/oGdXH7sdlKuGoWxWyTFcq3RcVkaVgjKtt6zeswkrpxMUv9/6NXPrhIWqdQm/wVw0Pv2p98yq10QRPyBv5yI8zcNjxueUl3aM8SZML87E6lhkUFFdAuVof+Sl5Pz8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:37:55 np0005539509 python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:37:56 np0005539509 python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:37:56 np0005539509 python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394676.1664808-252-165655715209557/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa follow=False checksum=5ac8bea8bfb8f348688bf24843ddb1285b2d351d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:37:57 np0005539509 python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:37:57 np0005539509 python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394677.1863534-307-11101954615071/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa.pub follow=False checksum=48b31d706687f3385690285b8caeaea67ea8286c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:37:59 np0005539509 python3[4971]: ansible-ping Invoked with data=pong
Nov 29 00:38:00 np0005539509 python3[4995]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:38:02 np0005539509 python3[5053]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 00:38:03 np0005539509 python3[5085]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:04 np0005539509 python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:04 np0005539509 python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:05 np0005539509 python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:05 np0005539509 python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:05 np0005539509 python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:07 np0005539509 python3[5231]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:08 np0005539509 python3[5309]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:08 np0005539509 python3[5382]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394687.580609-32-113287568176534/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:09 np0005539509 python3[5430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:09 np0005539509 python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:09 np0005539509 python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539509 python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539509 python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539509 python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539509 python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:11 np0005539509 python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:11 np0005539509 python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:11 np0005539509 python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:12 np0005539509 python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:12 np0005539509 python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:12 np0005539509 python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539509 python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539509 python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539509 python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539509 python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539509 python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539509 python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539509 python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539509 python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539509 python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539509 python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:16 np0005539509 python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:16 np0005539509 python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:16 np0005539509 python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:19 np0005539509 python3[6056]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 00:38:19 np0005539509 systemd[1]: Starting Time & Date Service...
Nov 29 00:38:19 np0005539509 systemd[1]: Started Time & Date Service.
Nov 29 00:38:19 np0005539509 systemd-timedated[6058]: Changed time zone to 'UTC' (UTC).
Nov 29 00:38:19 np0005539509 python3[6087]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:20 np0005539509 python3[6163]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:20 np0005539509 python3[6234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764394700.03186-252-178861774041832/source _original_basename=tmpydqc7bn4 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:21 np0005539509 python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:21 np0005539509 python3[6405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394701.1508152-303-221719467209209/source _original_basename=tmpw95s2vyv follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:22 np0005539509 python3[6507]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:23 np0005539509 python3[6580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394702.4281366-382-15652868957253/source _original_basename=tmp6ffexe8a follow=False checksum=01954034105cdb65b42722894a5c1036808c70c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:23 np0005539509 python3[6628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:38:24 np0005539509 python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:38:24 np0005539509 python3[6734]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:25 np0005539509 python3[6807]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394704.3946762-452-211574357919580/source _original_basename=tmpy78nwvio follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:25 np0005539509 python3[6858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3d5b-5bb0-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:38:26 np0005539509 python3[6886]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-3d5b-5bb0-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 00:38:27 np0005539509 python3[6914]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:46 np0005539509 python3[6940]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:49 np0005539509 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 00:39:46 np0005539509 systemd-logind[785]: Session 1 logged out. Waiting for processes to exit.
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 00:39:53 np0005539509 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 00:39:53 np0005539509 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1347] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:39:53 np0005539509 systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1556] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1590] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1596] device (eth1): carrier: link connected
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1598] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1606] policy: auto-activating connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2)
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1611] device (eth1): Activation: starting connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2)
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1612] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1615] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1620] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:39:53 np0005539509 NetworkManager[856]: <info>  [1764394793.1625] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:39:53 np0005539509 systemd[4300]: Starting Mark boot as successful...
Nov 29 00:39:53 np0005539509 systemd[4300]: Finished Mark boot as successful.
Nov 29 00:39:54 np0005539509 systemd-logind[785]: New session 3 of user zuul.
Nov 29 00:39:54 np0005539509 systemd[1]: Started Session 3 of User zuul.
Nov 29 00:39:54 np0005539509 python3[6977]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4e5a-44df-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:40:04 np0005539509 python3[7057]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:40:04 np0005539509 python3[7130]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394804.1989338-155-88084161914430/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=c1fba3f03f63934d2121e957385cfe4c48be3062 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:40:05 np0005539509 python3[7180]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 00:40:05 np0005539509 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 00:40:05 np0005539509 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 00:40:05 np0005539509 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 00:40:05 np0005539509 systemd[1]: Stopping Network Manager...
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.5969] caught SIGTERM, shutting down normally.
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.5979] dhcp4 (eth0): canceled DHCP transaction
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.5979] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.5979] dhcp4 (eth0): state changed no lease
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.5981] manager: NetworkManager state is now CONNECTING
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.6083] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.6084] dhcp4 (eth1): state changed no lease
Nov 29 00:40:05 np0005539509 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:40:05 np0005539509 NetworkManager[856]: <info>  [1764394805.6178] exiting (success)
Nov 29 00:40:05 np0005539509 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:40:05 np0005539509 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 00:40:05 np0005539509 systemd[1]: Stopped Network Manager.
Nov 29 00:40:05 np0005539509 systemd[1]: NetworkManager.service: Consumed 2.017s CPU time, 9.9M memory peak.
Nov 29 00:40:05 np0005539509 systemd[1]: Starting Network Manager...
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.6967] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:ba0dbca2-f496-4536-953e-379bfe5fc9e9)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.6969] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7025] manager[0x55c7c3208070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:40:05 np0005539509 systemd[1]: Starting Hostname Service...
Nov 29 00:40:05 np0005539509 systemd[1]: Started Hostname Service.
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7745] hostname: hostname: using hostnamed
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7746] hostname: static hostname changed from (none) to "np0005539509.novalocal"
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7754] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7759] manager[0x55c7c3208070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7760] manager[0x55c7c3208070]: rfkill: WWAN hardware radio set enabled
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7790] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7791] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7792] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7792] manager: Networking is enabled by state file
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7795] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7800] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7828] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7838] dhcp: init: Using DHCP client 'internal'
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7840] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7846] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7852] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7861] device (lo): Activation: starting connection 'lo' (e88f289b-57af-451c-b662-f7b3e0248e91)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7869] device (eth0): carrier: link connected
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7874] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7878] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7879] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7884] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7890] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7896] device (eth1): carrier: link connected
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7900] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7905] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2) (indicated)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7906] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7911] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7917] device (eth1): Activation: starting connection 'Wired connection 1' (869e6d79-5f7b-3b9f-b76e-078155d890a2)
Nov 29 00:40:05 np0005539509 systemd[1]: Started Network Manager.
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7931] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7940] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7945] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7948] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7953] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7959] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7964] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7970] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7976] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7991] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.7996] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8011] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8015] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:05 np0005539509 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8047] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8053] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8064] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8076] device (lo): Activation: successful, device activated.
Nov 29 00:40:05 np0005539509 NetworkManager[7192]: <info>  [1764394805.8094] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:40:06 np0005539509 NetworkManager[7192]: <info>  [1764394806.0200] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:40:06 np0005539509 NetworkManager[7192]: <info>  [1764394806.0239] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:40:06 np0005539509 NetworkManager[7192]: <info>  [1764394806.0268] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:40:06 np0005539509 NetworkManager[7192]: <info>  [1764394806.0273] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:40:06 np0005539509 NetworkManager[7192]: <info>  [1764394806.0277] device (eth0): Activation: successful, device activated.
Nov 29 00:40:06 np0005539509 NetworkManager[7192]: <info>  [1764394806.0285] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:40:06 np0005539509 python3[7245]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4e5a-44df-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:40:16 np0005539509 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:40:35 np0005539509 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5285] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:40:51 np0005539509 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:40:51 np0005539509 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5659] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5664] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5675] device (eth1): Activation: successful, device activated.
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5685] manager: startup complete
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5689] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <warn>  [1764394851.5699] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5712] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 00:40:51 np0005539509 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5824] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5825] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5825] dhcp4 (eth1): state changed no lease
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5850] policy: auto-activating connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5857] device (eth1): Activation: starting connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5858] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5863] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:40:51 np0005539509 NetworkManager[7192]: <info>  [1764394851.5886] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:40:52 np0005539509 NetworkManager[7192]: <info>  [1764394852.3898] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:40:52 np0005539509 NetworkManager[7192]: <info>  [1764394852.3903] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:40:52 np0005539509 NetworkManager[7192]: <info>  [1764394852.3915] device (eth1): Activation: successful, device activated.
Nov 29 00:41:02 np0005539509 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:41:06 np0005539509 systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 00:41:06 np0005539509 systemd[1]: session-3.scope: Consumed 1.856s CPU time.
Nov 29 00:41:06 np0005539509 systemd-logind[785]: Session 3 logged out. Waiting for processes to exit.
Nov 29 00:41:06 np0005539509 systemd-logind[785]: Removed session 3.
Nov 29 00:41:44 np0005539509 systemd-logind[785]: New session 4 of user zuul.
Nov 29 00:41:44 np0005539509 systemd[1]: Started Session 4 of User zuul.
Nov 29 00:41:45 np0005539509 python3[7375]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:41:45 np0005539509 python3[7448]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394904.7427914-373-147243090404980/source _original_basename=tmp38mmtp3x follow=False checksum=95c43167cb69fbe3f3b9eff0c3ecf63c2bbd5b70 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:41:48 np0005539509 systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 00:41:48 np0005539509 systemd-logind[785]: Session 4 logged out. Waiting for processes to exit.
Nov 29 00:41:48 np0005539509 systemd-logind[785]: Removed session 4.
Nov 29 00:43:05 np0005539509 systemd[4300]: Created slice User Background Tasks Slice.
Nov 29 00:43:05 np0005539509 systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 00:43:05 np0005539509 systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 00:47:03 np0005539509 systemd-logind[785]: New session 5 of user zuul.
Nov 29 00:47:03 np0005539509 systemd[1]: Started Session 5 of User zuul.
Nov 29 00:47:03 np0005539509 python3[7512]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca2-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:04 np0005539509 python3[7541]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:04 np0005539509 python3[7567]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:04 np0005539509 python3[7593]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:05 np0005539509 python3[7619]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:05 np0005539509 python3[7645]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:06 np0005539509 python3[7723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:06 np0005539509 python3[7796]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395225.862213-366-176894584212579/source _original_basename=tmp9avxdlxn follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:07 np0005539509 python3[7846]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 00:47:07 np0005539509 systemd[1]: Reloading.
Nov 29 00:47:07 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:47:09 np0005539509 python3[7902]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 00:47:10 np0005539509 python3[7928]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:10 np0005539509 python3[7956]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:10 np0005539509 python3[7984]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:11 np0005539509 python3[8012]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:11 np0005539509 python3[8039]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:12 np0005539509 python3[8069]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 00:47:15 np0005539509 systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 00:47:15 np0005539509 systemd[1]: session-5.scope: Consumed 4.730s CPU time.
Nov 29 00:47:15 np0005539509 systemd-logind[785]: Session 5 logged out. Waiting for processes to exit.
Nov 29 00:47:15 np0005539509 systemd-logind[785]: Removed session 5.
Nov 29 00:47:16 np0005539509 systemd-logind[785]: New session 6 of user zuul.
Nov 29 00:47:16 np0005539509 systemd[1]: Started Session 6 of User zuul.
Nov 29 00:47:17 np0005539509 python3[8102]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 00:47:30 np0005539509 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:30 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:47:39 np0005539509 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:39 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:47:48 np0005539509 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:48 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:47:49 np0005539509 setsebool[8169]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 00:47:49 np0005539509 setsebool[8169]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 00:48:00 np0005539509 kernel: SELinux:  Converting 388 SID table entries...
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:48:00 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:48:18 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:48:18 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 00:48:18 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 00:48:18 np0005539509 systemd[1]: Reloading.
Nov 29 00:48:18 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:48:18 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 00:48:24 np0005539509 python3[12661]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-4d52-d96a-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:48:25 np0005539509 kernel: evm: overlay not supported
Nov 29 00:48:25 np0005539509 systemd[4300]: Starting D-Bus User Message Bus...
Nov 29 00:48:25 np0005539509 dbus-broker-launch[13389]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 00:48:25 np0005539509 dbus-broker-launch[13389]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 00:48:25 np0005539509 systemd[4300]: Started D-Bus User Message Bus.
Nov 29 00:48:25 np0005539509 dbus-broker-lau[13389]: Ready
Nov 29 00:48:25 np0005539509 systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:48:25 np0005539509 systemd[4300]: Created slice Slice /user.
Nov 29 00:48:25 np0005539509 systemd[4300]: podman-13294.scope: unit configures an IP firewall, but not running as root.
Nov 29 00:48:25 np0005539509 systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 00:48:25 np0005539509 systemd[4300]: Started podman-13294.scope.
Nov 29 00:48:25 np0005539509 systemd[4300]: Started podman-pause-5b7b0f92.scope.
Nov 29 00:48:26 np0005539509 python3[13916]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:26 np0005539509 python3[13916]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 00:48:26 np0005539509 systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 00:48:26 np0005539509 systemd[1]: session-6.scope: Consumed 59.272s CPU time.
Nov 29 00:48:26 np0005539509 systemd-logind[785]: Session 6 logged out. Waiting for processes to exit.
Nov 29 00:48:26 np0005539509 systemd-logind[785]: Removed session 6.
Nov 29 00:48:52 np0005539509 systemd-logind[785]: New session 7 of user zuul.
Nov 29 00:48:52 np0005539509 systemd[1]: Started Session 7 of User zuul.
Nov 29 00:48:52 np0005539509 irqbalance[781]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 29 00:48:52 np0005539509 irqbalance[781]: IRQ 27 affinity is now unmanaged
Nov 29 00:48:52 np0005539509 python3[22637]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:48:53 np0005539509 python3[22826]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:48:54 np0005539509 python3[23128]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539509.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 00:48:54 np0005539509 python3[23326]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:48:55 np0005539509 python3[23565]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:48:55 np0005539509 python3[23786]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395335.1038623-168-10431261003854/source _original_basename=tmpnk1t0ssr follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:56 np0005539509 python3[24095]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 29 00:48:56 np0005539509 systemd[1]: Starting Hostname Service...
Nov 29 00:48:56 np0005539509 systemd[1]: Started Hostname Service.
Nov 29 00:48:57 np0005539509 systemd-hostnamed[24185]: Changed pretty hostname to 'compute-1'
Nov 29 00:48:57 np0005539509 systemd-hostnamed[24185]: Hostname set to <compute-1> (static)
Nov 29 00:48:57 np0005539509 NetworkManager[7192]: <info>  [1764395337.0309] hostname: static hostname changed from "np0005539509.novalocal" to "compute-1"
Nov 29 00:48:57 np0005539509 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:48:57 np0005539509 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:48:57 np0005539509 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 00:48:57 np0005539509 systemd[1]: session-7.scope: Consumed 2.720s CPU time.
Nov 29 00:48:57 np0005539509 systemd-logind[785]: Session 7 logged out. Waiting for processes to exit.
Nov 29 00:48:57 np0005539509 systemd-logind[785]: Removed session 7.
Nov 29 00:49:07 np0005539509 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:49:15 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 00:49:15 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 00:49:15 np0005539509 systemd[1]: man-db-cache-update.service: Consumed 1min 7.734s CPU time.
Nov 29 00:49:15 np0005539509 systemd[1]: run-r02812b7a0b4548359c0b36bd756f2b3e.service: Deactivated successfully.
Nov 29 00:49:27 np0005539509 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:50:55 np0005539509 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 00:50:55 np0005539509 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 00:50:55 np0005539509 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 00:50:55 np0005539509 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 00:53:01 np0005539509 systemd-logind[785]: New session 8 of user zuul.
Nov 29 00:53:01 np0005539509 systemd[1]: Started Session 8 of User zuul.
Nov 29 00:53:02 np0005539509 python3[30012]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:53:04 np0005539509 python3[30128]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:04 np0005539509 python3[30201]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:05 np0005539509 python3[30227]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:05 np0005539509 python3[30300]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:05 np0005539509 python3[30326]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:06 np0005539509 python3[30399]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:06 np0005539509 python3[30425]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:07 np0005539509 python3[30498]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:07 np0005539509 python3[30524]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:07 np0005539509 python3[30597]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:07 np0005539509 python3[30623]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:08 np0005539509 python3[30696]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:08 np0005539509 python3[30722]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:09 np0005539509 python3[30795]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.9069967-34046-40775915959999/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:20 np0005539509 python3[30844]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:58:20 np0005539509 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 00:58:20 np0005539509 systemd[1]: session-8.scope: Consumed 5.952s CPU time.
Nov 29 00:58:20 np0005539509 systemd-logind[785]: Session 8 logged out. Waiting for processes to exit.
Nov 29 00:58:20 np0005539509 systemd-logind[785]: Removed session 8.
Nov 29 01:06:27 np0005539509 systemd-logind[785]: New session 9 of user zuul.
Nov 29 01:06:27 np0005539509 systemd[1]: Started Session 9 of User zuul.
Nov 29 01:06:28 np0005539509 python3.9[31078]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:06:29 np0005539509 python3.9[31259]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:06:38 np0005539509 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 01:06:38 np0005539509 systemd[1]: session-9.scope: Consumed 8.530s CPU time.
Nov 29 01:06:38 np0005539509 systemd-logind[785]: Session 9 logged out. Waiting for processes to exit.
Nov 29 01:06:38 np0005539509 systemd-logind[785]: Removed session 9.
Nov 29 01:06:54 np0005539509 systemd-logind[785]: New session 10 of user zuul.
Nov 29 01:06:54 np0005539509 systemd[1]: Started Session 10 of User zuul.
Nov 29 01:06:55 np0005539509 python3.9[31473]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 01:06:56 np0005539509 python3.9[31647]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:06:57 np0005539509 python3.9[31799]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:06:59 np0005539509 python3.9[31952]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:07:00 np0005539509 python3.9[32104]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:07:01 np0005539509 python3.9[32256]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:07:02 np0005539509 python3.9[32381]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396420.8791122-183-233899878958317/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:07:02 np0005539509 irqbalance[781]: Cannot change IRQ 33 affinity: Operation not permitted
Nov 29 01:07:02 np0005539509 irqbalance[781]: IRQ 33 affinity is now unmanaged
Nov 29 01:07:03 np0005539509 python3.9[32533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:07:04 np0005539509 python3.9[32689]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:07:05 np0005539509 python3.9[32841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:07:06 np0005539509 python3.9[32991]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:07:10 np0005539509 python3.9[33244]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:07:10 np0005539509 python3.9[33394]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:07:12 np0005539509 python3.9[33548]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:07:13 np0005539509 python3.9[33706]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:07:14 np0005539509 python3.9[33790]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:08:00 np0005539509 systemd[1]: Reloading.
Nov 29 01:08:00 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:08:00 np0005539509 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 01:08:00 np0005539509 systemd[1]: Reloading.
Nov 29 01:08:00 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:08:00 np0005539509 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 01:08:00 np0005539509 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 01:08:01 np0005539509 systemd[1]: Reloading.
Nov 29 01:08:01 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:08:01 np0005539509 systemd[1]: Starting dnf makecache...
Nov 29 01:08:01 np0005539509 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 01:08:01 np0005539509 dnf[34081]: Failed determining last makecache time.
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-barbican-42b4c41831408a8e323 150 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 184 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-cinder-1c00d6490d88e436f26ef 167 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-stevedore-c4acc5639fd2329372142 187 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-cloudkitty-tests-tempest-2c80f8 207 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-os-net-config-9758ab42364673d01bc5014e 204 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 194 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-designate-tests-tempest-347fdbc 182 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-glance-1fd12c29b339f30fe823e 161 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 179 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-manila-3c01b7181572c95dac462 196 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-whitebox-neutron-tests-tempest- 206 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-octavia-ba397f07a7331190208c 187 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-watcher-c014f81a8647287f6dcc 173 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-tcib-1124124ec06aadbac34f0d340b 195 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 176 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-swift-dc98a8463506ac520c469a 187 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-python-tempestconf-8515371b7cceebd4282 182 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dnf[34081]: delorean-openstack-heat-ui-013accbfd179753bc3f0 169 kB/s | 3.0 kB     00:00
Nov 29 01:08:01 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:08:01 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:08:01 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:08:01 np0005539509 dnf[34081]: CentOS Stream 9 - BaseOS                         49 kB/s | 7.3 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: CentOS Stream 9 - AppStream                      48 kB/s | 7.4 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: CentOS Stream 9 - CRB                            73 kB/s | 7.2 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: CentOS Stream 9 - Extras packages                29 kB/s | 8.3 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: dlrn-antelope-testing                           141 kB/s | 3.0 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: dlrn-antelope-build-deps                        175 kB/s | 3.0 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: centos9-rabbitmq                                133 kB/s | 3.0 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: centos9-storage                                 127 kB/s | 3.0 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: centos9-opstools                                129 kB/s | 3.0 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: NFV SIG OpenvSwitch                             131 kB/s | 3.0 kB     00:00
Nov 29 01:08:02 np0005539509 dnf[34081]: repo-setup-centos-appstream                     183 kB/s | 4.4 kB     00:00
Nov 29 01:08:03 np0005539509 dnf[34081]: repo-setup-centos-baseos                        151 kB/s | 3.9 kB     00:00
Nov 29 01:08:03 np0005539509 dnf[34081]: repo-setup-centos-highavailability              168 kB/s | 3.9 kB     00:00
Nov 29 01:08:03 np0005539509 dnf[34081]: repo-setup-centos-powertools                    175 kB/s | 4.3 kB     00:00
Nov 29 01:08:03 np0005539509 dnf[34081]: Extra Packages for Enterprise Linux 9 - x86_64   88 kB/s |  33 kB     00:00
Nov 29 01:08:04 np0005539509 dnf[34081]: Metadata cache created.
Nov 29 01:08:04 np0005539509 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 01:08:04 np0005539509 systemd[1]: Finished dnf makecache.
Nov 29 01:08:04 np0005539509 systemd[1]: dnf-makecache.service: Consumed 1.818s CPU time.
Nov 29 01:09:13 np0005539509 kernel: SELinux:  Converting 2718 SID table entries...
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:09:13 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:09:14 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 01:09:14 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:09:14 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:09:14 np0005539509 systemd[1]: Reloading.
Nov 29 01:09:14 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:09:14 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:09:16 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:09:16 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:09:16 np0005539509 systemd[1]: man-db-cache-update.service: Consumed 1.572s CPU time.
Nov 29 01:09:16 np0005539509 systemd[1]: run-r98930854f7fc4a1cabb4738f9b03ebf0.service: Deactivated successfully.
Nov 29 01:09:18 np0005539509 python3.9[35361]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:09:21 np0005539509 python3.9[35644]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 01:09:22 np0005539509 python3.9[35796]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 01:09:26 np0005539509 python3.9[35950]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:09:27 np0005539509 python3.9[36102]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 01:09:30 np0005539509 python3.9[36254]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:33 np0005539509 python3.9[36406]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:09:37 np0005539509 python3.9[36529]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396572.4090178-672-115381945327346/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:09:38 np0005539509 python3.9[36683]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:09:39 np0005539509 python3.9[36835]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:09:40 np0005539509 python3.9[36988]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:09:41 np0005539509 python3.9[37140]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 01:09:42 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:09:43 np0005539509 python3.9[37294]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:09:44 np0005539509 python3.9[37452]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:09:45 np0005539509 python3.9[37612]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 01:09:46 np0005539509 python3.9[37765]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:09:47 np0005539509 python3.9[37923]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 01:09:48 np0005539509 python3.9[38075]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:09:51 np0005539509 python3.9[38228]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:52 np0005539509 python3.9[38380]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:09:52 np0005539509 python3.9[38503]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396591.5119562-1029-78695063460409/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:53 np0005539509 python3.9[38655]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:09:54 np0005539509 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:09:54 np0005539509 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 01:09:54 np0005539509 kernel: Bridge firewalling registered
Nov 29 01:09:54 np0005539509 systemd-modules-load[38659]: Inserted module 'br_netfilter'
Nov 29 01:09:54 np0005539509 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:09:55 np0005539509 python3.9[38814]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:09:55 np0005539509 python3.9[38937]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396594.6318362-1098-171061386826049/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:57 np0005539509 python3.9[39089]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:10:02 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:10:02 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:10:02 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:10:02 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:10:02 np0005539509 systemd[1]: Reloading.
Nov 29 01:10:03 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:03 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:10:04 np0005539509 python3.9[40574]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:10:05 np0005539509 python3.9[41503]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 01:10:06 np0005539509 python3.9[42193]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:10:07 np0005539509 python3.9[43072]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:07 np0005539509 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:10:07 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:10:07 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:10:07 np0005539509 systemd[1]: man-db-cache-update.service: Consumed 5.882s CPU time.
Nov 29 01:10:07 np0005539509 systemd[1]: run-rb9bf4a3f2bde443fa32c2d2735b599e3.service: Deactivated successfully.
Nov 29 01:10:08 np0005539509 systemd[1]: Starting Authorization Manager...
Nov 29 01:10:08 np0005539509 polkitd[43499]: Started polkitd version 0.117
Nov 29 01:10:08 np0005539509 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:10:08 np0005539509 systemd[1]: Started Authorization Manager.
Nov 29 01:10:09 np0005539509 python3.9[43669]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:10:09 np0005539509 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 01:10:09 np0005539509 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 01:10:09 np0005539509 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 01:10:09 np0005539509 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:10:09 np0005539509 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:10:10 np0005539509 python3.9[43831]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 01:10:14 np0005539509 python3.9[43983]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:10:14 np0005539509 systemd[1]: Reloading.
Nov 29 01:10:14 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:15 np0005539509 python3.9[44173]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:10:15 np0005539509 systemd[1]: Reloading.
Nov 29 01:10:15 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:17 np0005539509 python3.9[44362]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:17 np0005539509 python3.9[44515]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:17 np0005539509 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 01:10:18 np0005539509 python3.9[44668]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:21 np0005539509 python3.9[44830]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:22 np0005539509 python3.9[44983]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:10:22 np0005539509 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 01:10:22 np0005539509 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 01:10:22 np0005539509 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 01:10:22 np0005539509 systemd[1]: Starting Apply Kernel Variables...
Nov 29 01:10:22 np0005539509 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 01:10:22 np0005539509 systemd[1]: Finished Apply Kernel Variables.
Nov 29 01:10:22 np0005539509 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 01:10:22 np0005539509 systemd[1]: session-10.scope: Consumed 2min 28.892s CPU time.
Nov 29 01:10:22 np0005539509 systemd-logind[785]: Session 10 logged out. Waiting for processes to exit.
Nov 29 01:10:22 np0005539509 systemd-logind[785]: Removed session 10.
Nov 29 01:10:28 np0005539509 systemd-logind[785]: New session 11 of user zuul.
Nov 29 01:10:29 np0005539509 systemd[1]: Started Session 11 of User zuul.
Nov 29 01:10:30 np0005539509 python3.9[45166]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:10:31 np0005539509 python3.9[45322]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 01:10:32 np0005539509 python3.9[45475]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:10:34 np0005539509 python3.9[45633]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:10:35 np0005539509 python3.9[45793]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:10:36 np0005539509 python3.9[45877]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:10:40 np0005539509 python3.9[46043]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:10:55 np0005539509 kernel: SELinux:  Converting 2730 SID table entries...
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:10:55 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:10:55 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 01:10:55 np0005539509 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 01:10:57 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:10:57 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:10:57 np0005539509 systemd[1]: Reloading.
Nov 29 01:10:57 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:57 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:10:57 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:10:58 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:10:58 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:10:58 np0005539509 systemd[1]: man-db-cache-update.service: Consumed 1.024s CPU time.
Nov 29 01:10:58 np0005539509 systemd[1]: run-r90c2b8d4a6594a5f9b46a5416e65883c.service: Deactivated successfully.
Nov 29 01:10:59 np0005539509 python3.9[47145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:10:59 np0005539509 systemd[1]: Reloading.
Nov 29 01:10:59 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:59 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:10:59 np0005539509 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 01:10:59 np0005539509 chown[47187]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 01:10:59 np0005539509 ovs-ctl[47192]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 01:10:59 np0005539509 ovs-ctl[47192]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 01:10:59 np0005539509 ovs-ctl[47192]: Starting ovsdb-server [  OK  ]
Nov 29 01:10:59 np0005539509 ovs-vsctl[47241]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 01:11:00 np0005539509 ovs-vsctl[47257]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2fa83236-07b6-4ff7-bb56-9f4f13bed719\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 01:11:00 np0005539509 ovs-ctl[47192]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 01:11:00 np0005539509 ovs-ctl[47192]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:11:00 np0005539509 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 01:11:00 np0005539509 ovs-vsctl[47266]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 29 01:11:00 np0005539509 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 01:11:00 np0005539509 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 01:11:00 np0005539509 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 01:11:00 np0005539509 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 01:11:00 np0005539509 ovs-ctl[47310]: Inserting openvswitch module [  OK  ]
Nov 29 01:11:00 np0005539509 ovs-ctl[47279]: Starting ovs-vswitchd [  OK  ]
Nov 29 01:11:00 np0005539509 ovs-vsctl[47327]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 29 01:11:00 np0005539509 ovs-ctl[47279]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:11:00 np0005539509 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 01:11:00 np0005539509 systemd[1]: Starting Open vSwitch...
Nov 29 01:11:00 np0005539509 systemd[1]: Finished Open vSwitch.
Nov 29 01:11:01 np0005539509 python3.9[47479]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:11:02 np0005539509 python3.9[47631]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 01:11:03 np0005539509 kernel: SELinux:  Converting 2744 SID table entries...
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:11:03 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:11:05 np0005539509 python3.9[47786]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:11:05 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 01:11:06 np0005539509 python3.9[47944]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:11:08 np0005539509 python3.9[48097]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:11:10 np0005539509 python3.9[48384]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:11:11 np0005539509 python3.9[48534]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:11:12 np0005539509 python3.9[48688]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:11:13 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:11:13 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:11:13 np0005539509 systemd[1]: Reloading.
Nov 29 01:11:14 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:11:14 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:11:14 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:11:14 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:11:14 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:11:14 np0005539509 systemd[1]: run-r1d7b09ca33f74723a8f1f66caec391a5.service: Deactivated successfully.
Nov 29 01:11:15 np0005539509 python3.9[49006]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:11:15 np0005539509 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 01:11:15 np0005539509 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 01:11:15 np0005539509 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 01:11:15 np0005539509 systemd[1]: Stopping Network Manager...
Nov 29 01:11:15 np0005539509 NetworkManager[7192]: <info>  [1764396675.7907] caught SIGTERM, shutting down normally.
Nov 29 01:11:15 np0005539509 NetworkManager[7192]: <info>  [1764396675.7928] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:11:15 np0005539509 NetworkManager[7192]: <info>  [1764396675.7929] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:11:15 np0005539509 NetworkManager[7192]: <info>  [1764396675.7929] dhcp4 (eth0): state changed no lease
Nov 29 01:11:15 np0005539509 NetworkManager[7192]: <info>  [1764396675.7931] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:11:15 np0005539509 NetworkManager[7192]: <info>  [1764396675.8010] exiting (success)
Nov 29 01:11:15 np0005539509 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:11:15 np0005539509 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 01:11:15 np0005539509 systemd[1]: Stopped Network Manager.
Nov 29 01:11:15 np0005539509 systemd[1]: NetworkManager.service: Consumed 14.026s CPU time, 4.1M memory peak, read 0B from disk, written 37.5K to disk.
Nov 29 01:11:15 np0005539509 systemd[1]: Starting Network Manager...
Nov 29 01:11:15 np0005539509 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.8686] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:ba0dbca2-f496-4536-953e-379bfe5fc9e9)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.8687] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.8763] manager[0x56216546a090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 01:11:15 np0005539509 systemd[1]: Starting Hostname Service...
Nov 29 01:11:15 np0005539509 systemd[1]: Started Hostname Service.
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9769] hostname: hostname: using hostnamed
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9771] hostname: static hostname changed from (none) to "compute-1"
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9779] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9785] manager[0x56216546a090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9786] manager[0x56216546a090]: rfkill: WWAN hardware radio set enabled
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9824] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9840] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9841] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9842] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9843] manager: Networking is enabled by state file
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9846] settings: Loaded settings plugin: keyfile (internal)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9852] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9907] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9923] dhcp: init: Using DHCP client 'internal'
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9928] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9939] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9948] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9961] device (lo): Activation: starting connection 'lo' (e88f289b-57af-451c-b662-f7b3e0248e91)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9972] device (eth0): carrier: link connected
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9981] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9989] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 01:11:15 np0005539509 NetworkManager[49015]: <info>  [1764396675.9990] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0001] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0013] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0024] device (eth1): carrier: link connected
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0032] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0043] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb) (indicated)
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0044] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0051] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0063] device (eth1): Activation: starting connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 01:11:16 np0005539509 systemd[1]: Started Network Manager.
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0072] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0088] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0093] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0096] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0100] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0105] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0109] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0114] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0120] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0131] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0136] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0150] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0170] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0185] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0190] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0196] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0205] device (lo): Activation: successful, device activated.
Nov 29 01:11:16 np0005539509 systemd[1]: Starting Network Manager Wait Online...
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0220] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0300] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0311] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0322] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0328] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0335] device (eth1): Activation: successful, device activated.
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0349] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0351] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0357] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0363] device (eth0): Activation: successful, device activated.
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0371] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 01:11:16 np0005539509 NetworkManager[49015]: <info>  [1764396676.0414] manager: startup complete
Nov 29 01:11:16 np0005539509 systemd[1]: Finished Network Manager Wait Online.
Nov 29 01:11:17 np0005539509 python3.9[49232]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:11:23 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:11:24 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:11:24 np0005539509 systemd[1]: Reloading.
Nov 29 01:11:24 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:11:24 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:11:24 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:11:25 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:11:25 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:11:25 np0005539509 systemd[1]: run-re19393818e514a5ca2b2f35d9632efbd.service: Deactivated successfully.
Nov 29 01:11:26 np0005539509 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:11:30 np0005539509 python3.9[49691]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:11:31 np0005539509 python3.9[49843]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:32 np0005539509 python3.9[49997]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:33 np0005539509 python3.9[50149]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:33 np0005539509 python3.9[50301]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:34 np0005539509 python3.9[50453]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:35 np0005539509 python3.9[50605]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:11:36 np0005539509 python3.9[50728]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396694.7852724-653-19918081996793/.source _original_basename=.i40v_3pn follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:36 np0005539509 python3.9[50880]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:37 np0005539509 python3.9[51032]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 01:11:38 np0005539509 python3.9[51184]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:41 np0005539509 python3.9[51611]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 01:11:42 np0005539509 ansible-async_wrapper.py[51786]: Invoked with j640183032971 300 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.789081-851-279638518384644/AnsiballZ_edpm_os_net_config.py _
Nov 29 01:11:42 np0005539509 ansible-async_wrapper.py[51789]: Starting module and watcher
Nov 29 01:11:42 np0005539509 ansible-async_wrapper.py[51789]: Start watching 51790 (300)
Nov 29 01:11:42 np0005539509 ansible-async_wrapper.py[51790]: Start module (51790)
Nov 29 01:11:42 np0005539509 ansible-async_wrapper.py[51786]: Return async_wrapper task started.
Nov 29 01:11:42 np0005539509 python3.9[51791]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 01:11:43 np0005539509 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 01:11:43 np0005539509 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 01:11:43 np0005539509 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 01:11:43 np0005539509 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 01:11:43 np0005539509 kernel: cfg80211: failed to load regulatory.db
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.8463] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.8483] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.8964] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.8965] audit: op="connection-add" uuid="b952c8cb-7611-4778-be4c-bc06323a4506" name="br-ex-br" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.8989] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.8991] audit: op="connection-add" uuid="49f990b4-775c-4e02-bc38-e1a6aa226fe6" name="br-ex-port" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9009] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9010] audit: op="connection-add" uuid="d660df85-a2ac-4a29-8373-2f101d79d67f" name="eth1-port" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9027] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9028] audit: op="connection-add" uuid="41c18e8e-76c3-4e04-b760-435f81f44e36" name="vlan20-port" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9046] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9047] audit: op="connection-add" uuid="7a2c3747-3af2-4f7f-805f-3e293cbd8731" name="vlan21-port" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9063] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9064] audit: op="connection-add" uuid="4ab8e7b1-0190-4e16-a357-42a51c21fc46" name="vlan22-port" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9079] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9080] audit: op="connection-add" uuid="f976a7c4-f8bd-4537-9323-2d66bbc42bb6" name="vlan23-port" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9105] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9125] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9126] audit: op="connection-add" uuid="e861a275-5e1d-46ad-8a25-d41c1b167ab9" name="br-ex-if" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9186] audit: op="connection-update" uuid="f58d442a-350a-5956-a954-8dae41cac9cb" name="ci-private-network" args="ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv6.routes,ipv6.method,ovs-external-ids.data,ipv4.dns,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv4.method,ipv4.never-default,ovs-interface.type,connection.slave-type,connection.master,connection.controller,connection.port-type,connection.timestamp" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9206] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9207] audit: op="connection-add" uuid="f4fde273-6e7e-4204-b63a-acc23e947b61" name="vlan20-if" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9226] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9227] audit: op="connection-add" uuid="789f8a59-33d5-4ba4-9524-ea4d771fc63e" name="vlan21-if" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9247] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9249] audit: op="connection-add" uuid="76c2f154-19b3-46e5-a04d-a31e660b0888" name="vlan22-if" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9270] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9271] audit: op="connection-add" uuid="651303cc-960e-4019-b96b-2d109f1db40f" name="vlan23-if" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9288] audit: op="connection-delete" uuid="869e6d79-5f7b-3b9f-b76e-078155d890a2" name="Wired connection 1" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9302] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9313] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9318] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (b952c8cb-7611-4778-be4c-bc06323a4506)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9319] audit: op="connection-activate" uuid="b952c8cb-7611-4778-be4c-bc06323a4506" name="br-ex-br" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9320] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9328] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9332] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (49f990b4-775c-4e02-bc38-e1a6aa226fe6)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9333] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9339] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9344] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (d660df85-a2ac-4a29-8373-2f101d79d67f)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9348] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9354] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9358] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (41c18e8e-76c3-4e04-b760-435f81f44e36)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9360] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9367] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9371] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (7a2c3747-3af2-4f7f-805f-3e293cbd8731)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9372] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9378] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9383] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (4ab8e7b1-0190-4e16-a357-42a51c21fc46)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9384] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9391] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9395] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (f976a7c4-f8bd-4537-9323-2d66bbc42bb6)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9396] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9398] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9399] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9407] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9411] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9415] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e861a275-5e1d-46ad-8a25-d41c1b167ab9)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9416] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9419] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9420] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9421] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9422] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9431] device (eth1): disconnecting for new activation request.
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9431] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9433] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9435] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9435] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9438] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9441] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9444] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f4fde273-6e7e-4204-b63a-acc23e947b61)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9444] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9446] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9473] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9475] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9479] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9486] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9491] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (789f8a59-33d5-4ba4-9524-ea4d771fc63e)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9493] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9497] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9499] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9501] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9505] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9511] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9517] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (76c2f154-19b3-46e5-a04d-a31e660b0888)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9518] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9521] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9524] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9526] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9530] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9536] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9543] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (651303cc-960e-4019-b96b-2d109f1db40f)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9544] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9548] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9550] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9552] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9554] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9570] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9573] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9577] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9580] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9588] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9593] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9598] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9603] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9605] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9612] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9617] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9622] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9625] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9631] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9636] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9641] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9644] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9650] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9656] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9660] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9663] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9669] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9675] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9675] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9675] dhcp4 (eth0): state changed no lease
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9677] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9690] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51792 uid=0 result="fail" reason="Device is not activated"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9731] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9768] device (eth1): disconnecting for new activation request.
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9768] audit: op="connection-activate" uuid="f58d442a-350a-5956-a954-8dae41cac9cb" name="ci-private-network" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9773] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9781] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 kernel: ovs-system: entered promiscuous mode
Nov 29 01:11:44 np0005539509 kernel: Timeout policy base is empty
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9809] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9818] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 01:11:44 np0005539509 systemd-udevd[51797]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9836] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 NetworkManager[49015]: <info>  [1764396704.9856] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Nov 29 01:11:44 np0005539509 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:11:45 np0005539509 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0068] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0214] device (eth1): Activation: starting connection 'ci-private-network' (f58d442a-350a-5956-a954-8dae41cac9cb)
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0232] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0261] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0274] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 kernel: br-ex: entered promiscuous mode
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0303] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0312] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0320] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0321] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0323] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0325] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0327] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0329] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0349] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0357] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0362] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0366] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0373] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0382] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0388] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0394] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0400] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0406] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0411] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0427] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0431] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0438] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0447] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0460] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539509 kernel: vlan22: entered promiscuous mode
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0487] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0502] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0504] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0509] device (eth1): Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0555] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0560] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0569] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 kernel: vlan23: entered promiscuous mode
Nov 29 01:11:45 np0005539509 systemd-udevd[51798]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0631] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0654] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0698] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0700] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0710] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 kernel: vlan20: entered promiscuous mode
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.0977] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1005] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1039] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1041] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1052] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1070] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1094] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 kernel: vlan21: entered promiscuous mode
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1121] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1123] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1127] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539509 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1245] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1267] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1300] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1302] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539509 NetworkManager[49015]: <info>  [1764396705.1311] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:46 np0005539509 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 01:11:46 np0005539509 NetworkManager[49015]: <info>  [1764396706.2161] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 01:11:46 np0005539509 NetworkManager[49015]: <info>  [1764396706.4124] checkpoint[0x562165440950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 01:11:46 np0005539509 NetworkManager[49015]: <info>  [1764396706.4127] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51792 uid=0 result="success"
Nov 29 01:11:46 np0005539509 python3.9[52151]: ansible-ansible.legacy.async_status Invoked with jid=j640183032971.51786 mode=status _async_dir=/root/.ansible_async
Nov 29 01:11:46 np0005539509 NetworkManager[49015]: <info>  [1764396706.7778] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 01:11:46 np0005539509 NetworkManager[49015]: <info>  [1764396706.7797] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 01:11:47 np0005539509 NetworkManager[49015]: <info>  [1764396707.0845] audit: op="networking-control" arg="global-dns-configuration" pid=51792 uid=0 result="success"
Nov 29 01:11:47 np0005539509 NetworkManager[49015]: <info>  [1764396707.0899] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 01:11:47 np0005539509 NetworkManager[49015]: <info>  [1764396707.0947] audit: op="networking-control" arg="global-dns-configuration" pid=51792 uid=0 result="success"
Nov 29 01:11:47 np0005539509 NetworkManager[49015]: <info>  [1764396707.0987] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 01:11:47 np0005539509 NetworkManager[49015]: <info>  [1764396707.3567] checkpoint[0x562165440a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 01:11:47 np0005539509 NetworkManager[49015]: <info>  [1764396707.3572] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51792 uid=0 result="success"
Nov 29 01:11:47 np0005539509 ansible-async_wrapper.py[51790]: Module complete (51790)
Nov 29 01:11:47 np0005539509 ansible-async_wrapper.py[51789]: Done in kid B.
Nov 29 01:11:50 np0005539509 python3.9[52257]: ansible-ansible.legacy.async_status Invoked with jid=j640183032971.51786 mode=status _async_dir=/root/.ansible_async
Nov 29 01:11:50 np0005539509 python3.9[52357]: ansible-ansible.legacy.async_status Invoked with jid=j640183032971.51786 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 01:11:51 np0005539509 python3.9[52509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:11:52 np0005539509 python3.9[52632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396711.0557082-932-210303002402204/.source.returncode _original_basename=.t0pbo1y7 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:53 np0005539509 python3.9[52784]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:11:53 np0005539509 python3.9[52908]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396712.5962775-980-47150206159258/.source.cfg _original_basename=.f60mxffo follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:54 np0005539509 python3.9[53060]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:11:55 np0005539509 systemd[1]: Reloading Network Manager...
Nov 29 01:11:55 np0005539509 NetworkManager[49015]: <info>  [1764396715.0571] audit: op="reload" arg="0" pid=53064 uid=0 result="success"
Nov 29 01:11:55 np0005539509 NetworkManager[49015]: <info>  [1764396715.0582] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 01:11:55 np0005539509 systemd[1]: Reloaded Network Manager.
Nov 29 01:11:55 np0005539509 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 01:11:55 np0005539509 systemd[1]: session-11.scope: Consumed 59.132s CPU time.
Nov 29 01:11:55 np0005539509 systemd-logind[785]: Session 11 logged out. Waiting for processes to exit.
Nov 29 01:11:55 np0005539509 systemd-logind[785]: Removed session 11.
Nov 29 01:12:00 np0005539509 systemd-logind[785]: New session 12 of user zuul.
Nov 29 01:12:00 np0005539509 systemd[1]: Started Session 12 of User zuul.
Nov 29 01:12:01 np0005539509 python3.9[53250]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:02 np0005539509 python3.9[53404]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:05 np0005539509 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:12:05 np0005539509 python3.9[53598]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:12:06 np0005539509 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 01:12:06 np0005539509 systemd[1]: session-12.scope: Consumed 2.652s CPU time.
Nov 29 01:12:06 np0005539509 systemd-logind[785]: Session 12 logged out. Waiting for processes to exit.
Nov 29 01:12:06 np0005539509 systemd-logind[785]: Removed session 12.
Nov 29 01:12:11 np0005539509 systemd-logind[785]: New session 13 of user zuul.
Nov 29 01:12:11 np0005539509 systemd[1]: Started Session 13 of User zuul.
Nov 29 01:12:13 np0005539509 python3.9[53784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:14 np0005539509 python3.9[53938]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:15 np0005539509 python3.9[54094]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:16 np0005539509 python3.9[54179]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:12:18 np0005539509 python3.9[54332]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:21 np0005539509 python3.9[54527]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:22 np0005539509 python3.9[54679]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:12:22 np0005539509 systemd[1]: var-lib-containers-storage-overlay-compat3573815175-merged.mount: Deactivated successfully.
Nov 29 01:12:22 np0005539509 podman[54680]: 2025-11-29 06:12:22.192239362 +0000 UTC m=+0.058904675 system refresh
Nov 29 01:12:23 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:12:24 np0005539509 python3.9[54843]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:25 np0005539509 python3.9[54966]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396744.131724-203-102553718786577/.source.json follow=False _original_basename=podman_network_config.j2 checksum=22f94c64376a85d67765fd46a234a717ce2c216b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:26 np0005539509 python3.9[55118]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:27 np0005539509 python3.9[55241]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396745.7698796-248-176156533813758/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:28 np0005539509 python3.9[55393]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:28 np0005539509 python3.9[55545]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:29 np0005539509 python3.9[55697]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:30 np0005539509 python3.9[55849]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:31 np0005539509 python3.9[56001]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:12:33 np0005539509 python3.9[56154]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:34 np0005539509 python3.9[56308]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:12:35 np0005539509 python3.9[56460]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:12:36 np0005539509 python3.9[56612]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:12:38 np0005539509 python3.9[56765]: ansible-service_facts Invoked
Nov 29 01:12:38 np0005539509 network[56782]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:12:38 np0005539509 network[56783]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:12:38 np0005539509 network[56784]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:12:44 np0005539509 python3.9[57236]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:12:48 np0005539509 python3.9[57389]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 01:12:50 np0005539509 python3.9[57541]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:51 np0005539509 python3.9[57666]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396769.8162096-680-95232814609151/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:51 np0005539509 python3.9[57820]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:52 np0005539509 python3.9[57945]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396771.4413092-726-163996507871881/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:54 np0005539509 python3.9[58099]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:56 np0005539509 python3.9[58253]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:57 np0005539509 python3.9[58337]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:12:59 np0005539509 python3.9[58491]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:59 np0005539509 python3.9[58575]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:13:00 np0005539509 systemd[1]: Stopping NTP client/server...
Nov 29 01:13:00 np0005539509 chronyd[792]: chronyd exiting
Nov 29 01:13:00 np0005539509 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 01:13:00 np0005539509 systemd[1]: Stopped NTP client/server.
Nov 29 01:13:00 np0005539509 systemd[1]: Starting NTP client/server...
Nov 29 01:13:00 np0005539509 chronyd[58583]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 01:13:00 np0005539509 chronyd[58583]: Frequency -26.426 +/- 0.401 ppm read from /var/lib/chrony/drift
Nov 29 01:13:00 np0005539509 chronyd[58583]: Loaded seccomp filter (level 2)
Nov 29 01:13:00 np0005539509 systemd[1]: Started NTP client/server.
Nov 29 01:13:00 np0005539509 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 01:13:00 np0005539509 systemd[1]: session-13.scope: Consumed 30.578s CPU time.
Nov 29 01:13:00 np0005539509 systemd-logind[785]: Session 13 logged out. Waiting for processes to exit.
Nov 29 01:13:00 np0005539509 systemd-logind[785]: Removed session 13.
Nov 29 01:13:06 np0005539509 systemd-logind[785]: New session 14 of user zuul.
Nov 29 01:13:06 np0005539509 systemd[1]: Started Session 14 of User zuul.
Nov 29 01:13:07 np0005539509 python3.9[58764]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:07 np0005539509 python3.9[58918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:08 np0005539509 python3.9[59041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396787.3173635-69-200403109098984/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:09 np0005539509 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 01:13:09 np0005539509 systemd[1]: session-14.scope: Consumed 1.834s CPU time.
Nov 29 01:13:09 np0005539509 systemd-logind[785]: Session 14 logged out. Waiting for processes to exit.
Nov 29 01:13:09 np0005539509 systemd-logind[785]: Removed session 14.
Nov 29 01:13:14 np0005539509 systemd-logind[785]: New session 15 of user zuul.
Nov 29 01:13:14 np0005539509 systemd[1]: Started Session 15 of User zuul.
Nov 29 01:13:15 np0005539509 python3.9[59221]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:13:16 np0005539509 python3.9[59377]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:17 np0005539509 python3.9[59552]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:18 np0005539509 python3.9[59675]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764396797.1468072-88-185448013696681/.source.json _original_basename=.vkia39fo follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:19 np0005539509 python3.9[59827]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:20 np0005539509 python3.9[59952]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396799.286177-157-152693036389979/.source _original_basename=.pqini69z follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:21 np0005539509 python3.9[60104]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:13:22 np0005539509 python3.9[60256]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:23 np0005539509 python3.9[60379]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396801.9029558-229-161521784381972/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:13:23 np0005539509 python3.9[60531]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:24 np0005539509 python3.9[60654]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396803.1960824-229-136296519110047/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:13:25 np0005539509 python3.9[60806]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:26 np0005539509 python3.9[60958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:26 np0005539509 python3.9[61081]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396805.4823601-341-129878485737285/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:27 np0005539509 python3.9[61233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:28 np0005539509 python3.9[61356]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396806.9583416-385-192888254872458/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:29 np0005539509 python3.9[61508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:29 np0005539509 systemd[1]: Reloading.
Nov 29 01:13:29 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:29 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:30 np0005539509 systemd[1]: Reloading.
Nov 29 01:13:30 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:30 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:30 np0005539509 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 01:13:30 np0005539509 systemd[1]: Finished EDPM Container Shutdown.
Nov 29 01:13:31 np0005539509 python3.9[61734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:32 np0005539509 python3.9[61857]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396810.7913022-454-131646669638305/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:32 np0005539509 python3.9[62009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:33 np0005539509 python3.9[62132]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396812.2790513-499-206083136035783/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:34 np0005539509 python3.9[62284]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:34 np0005539509 systemd[1]: Reloading.
Nov 29 01:13:34 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:34 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:34 np0005539509 systemd[1]: Reloading.
Nov 29 01:13:34 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:34 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:35 np0005539509 systemd[1]: Starting Create netns directory...
Nov 29 01:13:35 np0005539509 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:13:35 np0005539509 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:13:35 np0005539509 systemd[1]: Finished Create netns directory.
Nov 29 01:13:36 np0005539509 python3.9[62511]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:13:36 np0005539509 network[62528]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:13:36 np0005539509 network[62529]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:13:36 np0005539509 network[62530]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:13:41 np0005539509 python3.9[62792]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:41 np0005539509 systemd[1]: Reloading.
Nov 29 01:13:41 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:41 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:41 np0005539509 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 01:13:41 np0005539509 iptables.init[62832]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 01:13:42 np0005539509 iptables.init[62832]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 01:13:42 np0005539509 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 01:13:42 np0005539509 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 01:13:43 np0005539509 python3.9[63028]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:45 np0005539509 python3.9[63182]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:45 np0005539509 systemd[1]: Reloading.
Nov 29 01:13:45 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:45 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:45 np0005539509 systemd[1]: Starting Netfilter Tables...
Nov 29 01:13:45 np0005539509 systemd[1]: Finished Netfilter Tables.
Nov 29 01:13:46 np0005539509 python3.9[63374]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:13:47 np0005539509 python3.9[63527]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:48 np0005539509 python3.9[63652]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396827.1233733-706-270345891972877/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:49 np0005539509 python3.9[63805]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:13:49 np0005539509 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 01:13:49 np0005539509 systemd[1]: Reloaded OpenSSH server daemon.
Nov 29 01:13:50 np0005539509 python3.9[63961]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:51 np0005539509 python3.9[64113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:51 np0005539509 python3.9[64236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396830.5914392-799-101510981831214/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:53 np0005539509 python3.9[64388]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 01:13:53 np0005539509 systemd[1]: Starting Time & Date Service...
Nov 29 01:13:53 np0005539509 systemd[1]: Started Time & Date Service.
Nov 29 01:13:54 np0005539509 python3.9[64544]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:55 np0005539509 python3.9[64696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:55 np0005539509 python3.9[64819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396834.504613-904-5399701062953/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:56 np0005539509 python3.9[64971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:57 np0005539509 python3.9[65094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396836.004822-950-188393539029024/.source.yaml _original_basename=.imv88qg5 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:58 np0005539509 python3.9[65246]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:58 np0005539509 python3.9[65369]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396837.575449-994-70796429740603/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:59 np0005539509 python3.9[65521]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:00 np0005539509 python3.9[65674]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:01 np0005539509 python3[65827]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:14:02 np0005539509 python3.9[65979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:03 np0005539509 python3.9[66102]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396841.9217458-1111-149388908292550/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:04 np0005539509 python3.9[66254]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:04 np0005539509 python3.9[66377]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396843.5359545-1156-265707800178324/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:05 np0005539509 python3.9[66529]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:06 np0005539509 python3.9[66652]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396845.142922-1201-184815662986162/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:07 np0005539509 python3.9[66804]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:07 np0005539509 python3.9[66927]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396846.664679-1246-247228150418312/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:08 np0005539509 python3.9[67079]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:09 np0005539509 python3.9[67202]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396848.2022111-1291-160402387134504/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:10 np0005539509 python3.9[67354]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:11 np0005539509 python3.9[67506]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:12 np0005539509 python3.9[67665]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:13 np0005539509 python3.9[67818]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:14 np0005539509 python3.9[67970]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:15 np0005539509 python3.9[68122]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:14:16 np0005539509 python3.9[68275]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:14:16 np0005539509 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 01:14:16 np0005539509 systemd[1]: session-15.scope: Consumed 43.077s CPU time.
Nov 29 01:14:16 np0005539509 systemd-logind[785]: Session 15 logged out. Waiting for processes to exit.
Nov 29 01:14:16 np0005539509 systemd-logind[785]: Removed session 15.
Nov 29 01:14:23 np0005539509 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 01:14:26 np0005539509 systemd-logind[785]: New session 16 of user zuul.
Nov 29 01:14:26 np0005539509 systemd[1]: Started Session 16 of User zuul.
Nov 29 01:14:27 np0005539509 python3.9[68459]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 01:14:28 np0005539509 python3.9[68613]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:14:29 np0005539509 python3.9[68765]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:14:30 np0005539509 python3.9[68917]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=#012 create=True mode=0644 path=/tmp/ansible.oiwd6zah state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:31 np0005539509 python3.9[69069]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.oiwd6zah' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:32 np0005539509 python3.9[69223]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.oiwd6zah state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:32 np0005539509 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 01:14:33 np0005539509 systemd[1]: session-16.scope: Consumed 4.297s CPU time.
Nov 29 01:14:33 np0005539509 systemd-logind[785]: Session 16 logged out. Waiting for processes to exit.
Nov 29 01:14:33 np0005539509 systemd-logind[785]: Removed session 16.
Nov 29 01:14:39 np0005539509 systemd-logind[785]: New session 17 of user zuul.
Nov 29 01:14:39 np0005539509 systemd[1]: Started Session 17 of User zuul.
Nov 29 01:14:40 np0005539509 python3.9[69403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:14:41 np0005539509 python3.9[69559]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:14:42 np0005539509 python3.9[69713]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:14:44 np0005539509 python3.9[69866]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:45 np0005539509 python3.9[70019]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:14:46 np0005539509 python3.9[70173]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:47 np0005539509 python3.9[70328]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:47 np0005539509 systemd-logind[785]: Session 17 logged out. Waiting for processes to exit.
Nov 29 01:14:47 np0005539509 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 01:14:47 np0005539509 systemd[1]: session-17.scope: Consumed 5.478s CPU time.
Nov 29 01:14:47 np0005539509 systemd-logind[785]: Removed session 17.
Nov 29 01:14:53 np0005539509 systemd-logind[785]: New session 18 of user zuul.
Nov 29 01:14:53 np0005539509 systemd[1]: Started Session 18 of User zuul.
Nov 29 01:14:54 np0005539509 python3.9[70506]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:14:55 np0005539509 python3.9[70662]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:14:56 np0005539509 python3.9[70746]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:14:58 np0005539509 python3.9[70897]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:15:00 np0005539509 python3.9[71048]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:15:00 np0005539509 python3.9[71198]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:15:00 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:15:00 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:15:01 np0005539509 python3.9[71349]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:15:02 np0005539509 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 01:15:02 np0005539509 systemd[1]: session-18.scope: Consumed 6.848s CPU time.
Nov 29 01:15:02 np0005539509 systemd-logind[785]: Session 18 logged out. Waiting for processes to exit.
Nov 29 01:15:02 np0005539509 systemd-logind[785]: Removed session 18.
Nov 29 01:15:09 np0005539509 chronyd[58583]: Selected source 142.4.192.253 (pool.ntp.org)
Nov 29 01:15:11 np0005539509 systemd-logind[785]: New session 19 of user zuul.
Nov 29 01:15:11 np0005539509 systemd[1]: Started Session 19 of User zuul.
Nov 29 01:15:18 np0005539509 python3[72116]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:15:20 np0005539509 python3[72211]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 01:15:21 np0005539509 python3[72238]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 01:15:22 np0005539509 python3[72264]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:15:22 np0005539509 kernel: loop: module loaded
Nov 29 01:15:22 np0005539509 kernel: loop3: detected capacity change from 0 to 14680064
Nov 29 01:15:22 np0005539509 python3[72299]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:15:22 np0005539509 lvm[72302]: PV /dev/loop3 not used.
Nov 29 01:15:22 np0005539509 lvm[72304]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:15:22 np0005539509 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 29 01:15:23 np0005539509 lvm[72314]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:15:23 np0005539509 lvm[72314]: VG ceph_vg0 finished
Nov 29 01:15:23 np0005539509 lvm[72312]:  1 logical volume(s) in volume group "ceph_vg0" now active
Nov 29 01:15:23 np0005539509 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 29 01:15:24 np0005539509 python3[72392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:15:24 np0005539509 python3[72465]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764396923.9252074-37029-138445971144281/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:15:25 np0005539509 python3[72515]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:15:25 np0005539509 systemd[1]: Reloading.
Nov 29 01:15:25 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:15:25 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:15:26 np0005539509 systemd[1]: Starting Ceph OSD losetup...
Nov 29 01:15:26 np0005539509 bash[72556]: /dev/loop3: [64513]:4194937 (/var/lib/ceph-osd-0.img)
Nov 29 01:15:26 np0005539509 systemd[1]: Finished Ceph OSD losetup.
Nov 29 01:15:26 np0005539509 lvm[72560]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:15:26 np0005539509 lvm[72560]: VG ceph_vg0 finished
Nov 29 01:15:28 np0005539509 python3[72586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:29 np0005539509 systemd-logind[785]: New session 20 of user ceph-admin.
Nov 29 01:17:29 np0005539509 systemd[1]: Created slice User Slice of UID 42477.
Nov 29 01:17:29 np0005539509 systemd[1]: Starting User Runtime Directory /run/user/42477...
Nov 29 01:17:29 np0005539509 systemd[1]: Finished User Runtime Directory /run/user/42477.
Nov 29 01:17:30 np0005539509 systemd[1]: Starting User Manager for UID 42477...
Nov 29 01:17:30 np0005539509 systemd-logind[785]: New session 22 of user ceph-admin.
Nov 29 01:17:30 np0005539509 systemd[72642]: Queued start job for default target Main User Target.
Nov 29 01:17:30 np0005539509 systemd[72642]: Created slice User Application Slice.
Nov 29 01:17:30 np0005539509 systemd[72642]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:17:30 np0005539509 systemd[72642]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:17:30 np0005539509 systemd[72642]: Reached target Paths.
Nov 29 01:17:30 np0005539509 systemd[72642]: Reached target Timers.
Nov 29 01:17:30 np0005539509 systemd[72642]: Starting D-Bus User Message Bus Socket...
Nov 29 01:17:30 np0005539509 systemd[72642]: Starting Create User's Volatile Files and Directories...
Nov 29 01:17:30 np0005539509 systemd[72642]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:17:30 np0005539509 systemd[72642]: Reached target Sockets.
Nov 29 01:17:30 np0005539509 systemd[72642]: Finished Create User's Volatile Files and Directories.
Nov 29 01:17:30 np0005539509 systemd[72642]: Reached target Basic System.
Nov 29 01:17:30 np0005539509 systemd[72642]: Reached target Main User Target.
Nov 29 01:17:30 np0005539509 systemd[72642]: Startup finished in 171ms.
Nov 29 01:17:30 np0005539509 systemd[1]: Started User Manager for UID 42477.
Nov 29 01:17:30 np0005539509 systemd[1]: Started Session 20 of User ceph-admin.
Nov 29 01:17:30 np0005539509 systemd[1]: Started Session 22 of User ceph-admin.
Nov 29 01:17:30 np0005539509 systemd-logind[785]: New session 23 of user ceph-admin.
Nov 29 01:17:30 np0005539509 systemd[1]: Started Session 23 of User ceph-admin.
Nov 29 01:17:31 np0005539509 systemd-logind[785]: New session 24 of user ceph-admin.
Nov 29 01:17:31 np0005539509 systemd[1]: Started Session 24 of User ceph-admin.
Nov 29 01:17:31 np0005539509 systemd-logind[785]: New session 25 of user ceph-admin.
Nov 29 01:17:31 np0005539509 systemd[1]: Started Session 25 of User ceph-admin.
Nov 29 01:17:31 np0005539509 systemd-logind[785]: New session 26 of user ceph-admin.
Nov 29 01:17:31 np0005539509 systemd[1]: Started Session 26 of User ceph-admin.
Nov 29 01:17:32 np0005539509 systemd-logind[785]: New session 27 of user ceph-admin.
Nov 29 01:17:32 np0005539509 systemd[1]: Started Session 27 of User ceph-admin.
Nov 29 01:17:32 np0005539509 systemd-logind[785]: New session 28 of user ceph-admin.
Nov 29 01:17:32 np0005539509 systemd[1]: Started Session 28 of User ceph-admin.
Nov 29 01:17:33 np0005539509 systemd-logind[785]: New session 29 of user ceph-admin.
Nov 29 01:17:33 np0005539509 systemd[1]: Started Session 29 of User ceph-admin.
Nov 29 01:17:33 np0005539509 systemd-logind[785]: New session 30 of user ceph-admin.
Nov 29 01:17:33 np0005539509 systemd[1]: Started Session 30 of User ceph-admin.
Nov 29 01:17:34 np0005539509 systemd-logind[785]: New session 31 of user ceph-admin.
Nov 29 01:17:34 np0005539509 systemd[1]: Started Session 31 of User ceph-admin.
Nov 29 01:17:34 np0005539509 systemd-logind[785]: New session 32 of user ceph-admin.
Nov 29 01:17:34 np0005539509 systemd[1]: Started Session 32 of User ceph-admin.
Nov 29 01:17:35 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:35 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:36 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:36 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:37 np0005539509 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73618 (sysctl)
Nov 29 01:17:37 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:37 np0005539509 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 01:17:37 np0005539509 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 01:17:38 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:38 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:38 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:46 np0005539509 podman[73895]: 2025-11-29 06:17:46.990807249 +0000 UTC m=+8.036427383 image pull-error  quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0 unable to copy from source docker://quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0: copying system image from manifest list: reading blob sha256:5d6e359445031266a061adaf2d66bc7e110161eb2d4cc1c20df0b7b391e2e65a: Get "https://cdn01.quay.io/quayio-production-s3/sha256/5d/5d6e359445031266a061adaf2d66bc7e110161eb2d4cc1c20df0b7b391e2e65a?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251129%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251129T061739Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=f2c8d0740ca5e417432f59f87b3e1a078374b58c1ece456a50051e5535d22395&region=us-east-1&namespace=ceph&repo_name=ceph&akamai_signature=exp=1764397959~hmac=489e02c9402b7ae0da17693faba5808124f30e4a58d26632cd42db255b5697df": remote error: tls: internal error
Nov 29 01:17:46 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:52 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:52 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:17:54 np0005539509 systemd[1]: var-lib-containers-storage-overlay-compat3728586434-merged.mount: Deactivated successfully.
Nov 29 01:17:54 np0005539509 systemd[1]: var-lib-containers-storage-overlay-compat3728586434-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.408385917 +0000 UTC m=+23.966234202 container create f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Nov 29 01:18:16 np0005539509 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3820584154-merged.mount: Deactivated successfully.
Nov 29 01:18:16 np0005539509 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 01:18:16 np0005539509 systemd[1]: Started libpod-conmon-f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96.scope.
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.388224989 +0000 UTC m=+23.946073294 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:16 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.526671639 +0000 UTC m=+24.084519984 container init f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default)
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.538702721 +0000 UTC m=+24.096551046 container start f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.543640118 +0000 UTC m=+24.101488443 container attach f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:16 np0005539509 reverent_keller[75815]: 167 167
Nov 29 01:18:16 np0005539509 systemd[1]: libpod-f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96.scope: Deactivated successfully.
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.549063918 +0000 UTC m=+24.106912223 container died f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:16 np0005539509 systemd[1]: var-lib-containers-storage-overlay-f5cad89f6770416eba1342b8ee95aa48dbbe36647ee73ef4b65e74454b2515db-merged.mount: Deactivated successfully.
Nov 29 01:18:16 np0005539509 podman[75746]: 2025-11-29 06:18:16.600769508 +0000 UTC m=+24.158617803 container remove f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=reverent_keller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:16 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:16 np0005539509 systemd[1]: libpod-conmon-f6b19f24a828c441af93903b14611058957359113ef666986bd07f2066d8bc96.scope: Deactivated successfully.
Nov 29 01:18:16 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:16 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:16 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:16 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:16 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:17 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:17 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:17 np0005539509 systemd[1]: Reached target All Ceph clusters and services.
Nov 29 01:18:17 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:17 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:17 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:17 np0005539509 systemd[1]: Reached target Ceph cluster 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:18:17 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:17 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:17 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:17 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:17 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:17 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:18 np0005539509 systemd[1]: Created slice Slice /system/ceph-336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:18:18 np0005539509 systemd[1]: Reached target System Time Set.
Nov 29 01:18:18 np0005539509 systemd[1]: Reached target System Time Synchronized.
Nov 29 01:18:18 np0005539509 systemd[1]: Starting Ceph crash.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:18:18 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:18 np0005539509 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:18 np0005539509 podman[76076]: 2025-11-29 06:18:18.389064124 +0000 UTC m=+0.057101141 container create 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Nov 29 01:18:18 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f781cdad64ba65b454ab70e9b45e6bf8cfd68f347c8f32c578688a8e4d7b89b7/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:18 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f781cdad64ba65b454ab70e9b45e6bf8cfd68f347c8f32c578688a8e4d7b89b7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:18 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f781cdad64ba65b454ab70e9b45e6bf8cfd68f347c8f32c578688a8e4d7b89b7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:18 np0005539509 podman[76076]: 2025-11-29 06:18:18.448740364 +0000 UTC m=+0.116777401 container init 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:18:18 np0005539509 podman[76076]: 2025-11-29 06:18:18.453187868 +0000 UTC m=+0.121224855 container start 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:18:18 np0005539509 bash[76076]: 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6
Nov 29 01:18:18 np0005539509 podman[76076]: 2025-11-29 06:18:18.369091641 +0000 UTC m=+0.037128628 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:18 np0005539509 systemd[1]: Started Ceph crash.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.851+0000 7fd226a7b640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.851+0000 7fd226a7b640 -1 AuthRegistry(0x7fd220066fe0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.853+0000 7fd226a7b640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.853+0000 7fd226a7b640 -1 AuthRegistry(0x7fd226a7a000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.856+0000 7fd21ffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: 2025-11-29T06:18:18.856+0000 7fd226a7b640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 29 01:18:18 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1[76091]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.117693068 +0000 UTC m=+0.055146186 container create eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:19 np0005539509 systemd[1]: Started libpod-conmon-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope.
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.086899716 +0000 UTC m=+0.024352884 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:19 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.217589601 +0000 UTC m=+0.155042719 container init eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.231745753 +0000 UTC m=+0.169198871 container start eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.236317129 +0000 UTC m=+0.173770217 container attach eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:19 np0005539509 keen_cori[76263]: 167 167
Nov 29 01:18:19 np0005539509 systemd[1]: libpod-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope: Deactivated successfully.
Nov 29 01:18:19 np0005539509 conmon[76263]: conmon eecf89c619bd173bf00c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope/container/memory.events
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.239883467 +0000 UTC m=+0.177336585 container died eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:18:19 np0005539509 systemd[1]: var-lib-containers-storage-overlay-1c0c7a7c932ef8ca8f43a9e68cec7183d38a0b38892b0d000e58a72a4d9ad42d-merged.mount: Deactivated successfully.
Nov 29 01:18:19 np0005539509 podman[76247]: 2025-11-29 06:18:19.294463997 +0000 UTC m=+0.231917115 container remove eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=keen_cori, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:18:19 np0005539509 systemd[1]: libpod-conmon-eecf89c619bd173bf00ca274ce91836f2e1546ca65d348d5b8176f60f241a7cd.scope: Deactivated successfully.
Nov 29 01:18:19 np0005539509 podman[76288]: 2025-11-29 06:18:19.541726917 +0000 UTC m=+0.079134691 container create 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:18:19 np0005539509 systemd[1]: Started libpod-conmon-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope.
Nov 29 01:18:19 np0005539509 podman[76288]: 2025-11-29 06:18:19.506676367 +0000 UTC m=+0.044084181 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:19 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:19 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:19 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:19 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:19 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:19 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:19 np0005539509 podman[76288]: 2025-11-29 06:18:19.653484318 +0000 UTC m=+0.190892122 container init 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 01:18:19 np0005539509 podman[76288]: 2025-11-29 06:18:19.668669589 +0000 UTC m=+0.206077363 container start 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:19 np0005539509 podman[76288]: 2025-11-29 06:18:19.674691865 +0000 UTC m=+0.212099629 container attach 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 01:18:20 np0005539509 stupefied_noether[76304]: --> passed data devices: 0 physical, 1 LVM
Nov 29 01:18:20 np0005539509 stupefied_noether[76304]: --> relative data size: 1.0
Nov 29 01:18:20 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 01:18:20 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f793b967-de22-4105-bb0d-c91464bf150f
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:21 np0005539509 lvm[76352]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:18:21 np0005539509 lvm[76352]: VG ceph_vg0 finished
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: stderr: got monmap epoch 1
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: --> Creating keyring file for osd.0
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Nov 29 01:18:21 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid f793b967-de22-4105-bb0d-c91464bf150f --setuser ceph --setgroup ceph
Nov 29 01:18:24 np0005539509 stupefied_noether[76304]: stderr: 2025-11-29T06:18:21.826+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 01:18:24 np0005539509 stupefied_noether[76304]: stderr: 2025-11-29T06:18:21.827+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 01:18:24 np0005539509 stupefied_noether[76304]: stderr: 2025-11-29T06:18:21.827+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 01:18:24 np0005539509 stupefied_noether[76304]: stderr: 2025-11-29T06:18:21.827+0000 7fb6d587f740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Nov 29 01:18:24 np0005539509 stupefied_noether[76304]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: --> ceph-volume lvm activate successful for osd ID: 0
Nov 29 01:18:25 np0005539509 stupefied_noether[76304]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 29 01:18:25 np0005539509 systemd[1]: libpod-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope: Deactivated successfully.
Nov 29 01:18:25 np0005539509 systemd[1]: libpod-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope: Consumed 2.543s CPU time.
Nov 29 01:18:25 np0005539509 podman[76288]: 2025-11-29 06:18:25.17266064 +0000 UTC m=+5.710068424 container died 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:18:25 np0005539509 systemd[1]: var-lib-containers-storage-overlay-6596c10cdc1392fd8e28adbe0e35b3546efcb06580524b7bb27a3cf84b35a260-merged.mount: Deactivated successfully.
Nov 29 01:18:26 np0005539509 podman[76288]: 2025-11-29 06:18:26.013482478 +0000 UTC m=+6.550890252 container remove 4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stupefied_noether, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:18:26 np0005539509 systemd[1]: libpod-conmon-4db4b3c293713ed6628bd007117ef670e4921a99e2584d967f8ecdfcd2f33af2.scope: Deactivated successfully.
Nov 29 01:18:26 np0005539509 podman[77424]: 2025-11-29 06:18:26.743262048 +0000 UTC m=+0.026669833 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:26 np0005539509 podman[77424]: 2025-11-29 06:18:26.884457687 +0000 UTC m=+0.167865472 container create 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef)
Nov 29 01:18:26 np0005539509 systemd[1]: Started libpod-conmon-730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874.scope.
Nov 29 01:18:26 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:27 np0005539509 podman[77424]: 2025-11-29 06:18:27.005128489 +0000 UTC m=+0.288536284 container init 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507)
Nov 29 01:18:27 np0005539509 podman[77424]: 2025-11-29 06:18:27.018608139 +0000 UTC m=+0.302015924 container start 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Nov 29 01:18:27 np0005539509 recursing_herschel[77441]: 167 167
Nov 29 01:18:27 np0005539509 systemd[1]: libpod-730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874.scope: Deactivated successfully.
Nov 29 01:18:27 np0005539509 podman[77424]: 2025-11-29 06:18:27.098653638 +0000 UTC m=+0.382061463 container attach 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507)
Nov 29 01:18:27 np0005539509 podman[77424]: 2025-11-29 06:18:27.099444046 +0000 UTC m=+0.382851831 container died 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:27 np0005539509 systemd[1]: var-lib-containers-storage-overlay-021639303c70fff0e21c2d8bfb8c23dd309bdd38bbb3c9f264dbcbee9cf71e30-merged.mount: Deactivated successfully.
Nov 29 01:18:27 np0005539509 podman[77424]: 2025-11-29 06:18:27.510664496 +0000 UTC m=+0.794072271 container remove 730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_herschel, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Nov 29 01:18:27 np0005539509 systemd[1]: libpod-conmon-730f3a1544c2df6cd9b2570329007d46c616308c17c0772701707ecdad3d6874.scope: Deactivated successfully.
Nov 29 01:18:27 np0005539509 podman[77468]: 2025-11-29 06:18:27.773673362 +0000 UTC m=+0.092352134 container create 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:18:27 np0005539509 podman[77468]: 2025-11-29 06:18:27.709200589 +0000 UTC m=+0.027879381 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:27 np0005539509 systemd[1]: Started libpod-conmon-140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1.scope.
Nov 29 01:18:27 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:27 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:27 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:27 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:27 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:27 np0005539509 podman[77468]: 2025-11-29 06:18:27.935380966 +0000 UTC m=+0.254059798 container init 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Nov 29 01:18:27 np0005539509 podman[77468]: 2025-11-29 06:18:27.949232684 +0000 UTC m=+0.267911456 container start 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 01:18:27 np0005539509 podman[77468]: 2025-11-29 06:18:27.953209703 +0000 UTC m=+0.271888465 container attach 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]: {
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:    "0": [
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:        {
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "devices": [
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "/dev/loop3"
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            ],
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "lv_name": "ceph_lv0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "lv_size": "7511998464",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=336ec58c-893b-528f-a0c1-6ed1196bc047,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f793b967-de22-4105-bb0d-c91464bf150f,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "lv_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "name": "ceph_lv0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "tags": {
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.block_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.cephx_lockbox_secret": "",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.cluster_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.cluster_name": "ceph",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.crush_device_class": "",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.encrypted": "0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.osd_fsid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.osd_id": "0",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.osdspec_affinity": "default_drive_group",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.type": "block",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:                "ceph.vdo": "0"
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            },
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "type": "block",
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:            "vg_name": "ceph_vg0"
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:        }
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]:    ]
Nov 29 01:18:28 np0005539509 trusting_bassi[77484]: }
Nov 29 01:18:28 np0005539509 systemd[1]: libpod-140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1.scope: Deactivated successfully.
Nov 29 01:18:28 np0005539509 podman[77468]: 2025-11-29 06:18:28.781945642 +0000 UTC m=+1.100624384 container died 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Nov 29 01:18:29 np0005539509 systemd[1]: var-lib-containers-storage-overlay-c99024dbff69317d78cf0765d176e8f92aa5e050e65b73caace64ceee49dd7a8-merged.mount: Deactivated successfully.
Nov 29 01:18:29 np0005539509 podman[77468]: 2025-11-29 06:18:29.259051257 +0000 UTC m=+1.577729999 container remove 140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=trusting_bassi, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Nov 29 01:18:29 np0005539509 systemd[1]: libpod-conmon-140a060640c2fe61a7000d2309191848cd0ea61c73accbd4ea670065b62723f1.scope: Deactivated successfully.
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.098540136 +0000 UTC m=+0.032022523 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.329569421 +0000 UTC m=+0.263051748 container create 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef)
Nov 29 01:18:30 np0005539509 systemd[1]: Started libpod-conmon-00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15.scope.
Nov 29 01:18:30 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.506948074 +0000 UTC m=+0.440430371 container init 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.515248918 +0000 UTC m=+0.448731205 container start 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Nov 29 01:18:30 np0005539509 boring_leakey[77663]: 167 167
Nov 29 01:18:30 np0005539509 systemd[1]: libpod-00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15.scope: Deactivated successfully.
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.565568046 +0000 UTC m=+0.499050363 container attach 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.566113419 +0000 UTC m=+0.499595736 container died 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:30 np0005539509 systemd[1]: var-lib-containers-storage-overlay-39a1f0a4234a2a43258903052215cc9e4841dbfd9c0e17576e3af253c4269edf-merged.mount: Deactivated successfully.
Nov 29 01:18:30 np0005539509 podman[77646]: 2025-11-29 06:18:30.615376584 +0000 UTC m=+0.548858891 container remove 00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_leakey, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:30 np0005539509 systemd[1]: libpod-conmon-00417ad76c92009e3e3ec7381344462272c0b74c4902ad4d69e11da6f2843c15.scope: Deactivated successfully.
Nov 29 01:18:30 np0005539509 podman[77696]: 2025-11-29 06:18:30.964851972 +0000 UTC m=+0.052864737 container create 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 01:18:31 np0005539509 systemd[1]: Started libpod-conmon-89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237.scope.
Nov 29 01:18:31 np0005539509 podman[77696]: 2025-11-29 06:18:30.945050691 +0000 UTC m=+0.033063466 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:31 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:31 np0005539509 podman[77696]: 2025-11-29 06:18:31.090032584 +0000 UTC m=+0.178045369 container init 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Nov 29 01:18:31 np0005539509 podman[77696]: 2025-11-29 06:18:31.106386218 +0000 UTC m=+0.194398993 container start 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:31 np0005539509 podman[77696]: 2025-11-29 06:18:31.110982009 +0000 UTC m=+0.198994774 container attach 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:18:31 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test[77713]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 29 01:18:31 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test[77713]:                            [--no-systemd] [--no-tmpfs]
Nov 29 01:18:31 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test[77713]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 29 01:18:31 np0005539509 systemd[1]: libpod-89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237.scope: Deactivated successfully.
Nov 29 01:18:31 np0005539509 podman[77696]: 2025-11-29 06:18:31.808104904 +0000 UTC m=+0.896117669 container died 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS)
Nov 29 01:18:31 np0005539509 systemd[1]: var-lib-containers-storage-overlay-6b44f967915d7a3c71e925ca71f96a9ec2536b6befff91f27ab01943aa010520-merged.mount: Deactivated successfully.
Nov 29 01:18:31 np0005539509 podman[77696]: 2025-11-29 06:18:31.876810381 +0000 UTC m=+0.964823126 container remove 89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:18:31 np0005539509 systemd[1]: libpod-conmon-89745559b846bcf73fe56addd0c79b197129a88a3c507d9ecf07ef3407ea9237.scope: Deactivated successfully.
Nov 29 01:18:32 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:32 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:32 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:32 np0005539509 systemd[1]: Reloading.
Nov 29 01:18:32 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:32 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:18:32 np0005539509 systemd[1]: Starting Ceph osd.0 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:18:33 np0005539509 podman[77873]: 2025-11-29 06:18:33.056007701 +0000 UTC m=+0.061149881 container create 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:18:33 np0005539509 podman[77873]: 2025-11-29 06:18:33.030492994 +0000 UTC m=+0.035635184 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:33 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:33 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:33 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:33 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:33 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:33 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:33 np0005539509 podman[77873]: 2025-11-29 06:18:33.152815963 +0000 UTC m=+0.157958183 container init 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:33 np0005539509 podman[77873]: 2025-11-29 06:18:33.166427895 +0000 UTC m=+0.171570035 container start 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:18:33 np0005539509 podman[77873]: 2025-11-29 06:18:33.170900815 +0000 UTC m=+0.176042985 container attach 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 01:18:34 np0005539509 bash[77873]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:18:34 np0005539509 bash[77873]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:18:34 np0005539509 bash[77873]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:18:34 np0005539509 bash[77873]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:34 np0005539509 bash[77873]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 01:18:34 np0005539509 bash[77873]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 29 01:18:34 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate[77888]: --> ceph-volume raw activate successful for osd ID: 0
Nov 29 01:18:34 np0005539509 bash[77873]: --> ceph-volume raw activate successful for osd ID: 0
Nov 29 01:18:34 np0005539509 systemd[1]: libpod-736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a.scope: Deactivated successfully.
Nov 29 01:18:34 np0005539509 systemd[1]: libpod-736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a.scope: Consumed 1.103s CPU time.
Nov 29 01:18:34 np0005539509 podman[78008]: 2025-11-29 06:18:34.297774501 +0000 UTC m=+0.029999257 container died 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Nov 29 01:18:34 np0005539509 systemd[1]: var-lib-containers-storage-overlay-7eeb8af09b8531506802469ccbd33f189b7f8ea7004c5f6e54790d98f093963d-merged.mount: Deactivated successfully.
Nov 29 01:18:34 np0005539509 podman[78008]: 2025-11-29 06:18:34.367642485 +0000 UTC m=+0.099867141 container remove 736042499a598550435c0dc09047327423b248e859d84a8cfc95f8ed031e472a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0-activate, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Nov 29 01:18:34 np0005539509 podman[78069]: 2025-11-29 06:18:34.602058104 +0000 UTC m=+0.047562398 container create 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 01:18:34 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:34 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:34 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:34 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:34 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a1ab998af2835240f03fd4bff908c647865d353b867edf3d8f61e9f5621b998/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:34 np0005539509 podman[78069]: 2025-11-29 06:18:34.582731225 +0000 UTC m=+0.028235519 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:34 np0005539509 podman[78069]: 2025-11-29 06:18:34.697731742 +0000 UTC m=+0.143236036 container init 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 01:18:34 np0005539509 podman[78069]: 2025-11-29 06:18:34.714417012 +0000 UTC m=+0.159921286 container start 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:18:34 np0005539509 bash[78069]: 142ead126c9af06537b8a836b8bfc0c1c7aa78776cf87cddc2c00f33cd39daba
Nov 29 01:18:34 np0005539509 systemd[1]: Started Ceph osd.0 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: pidfile_write: ignore empty --pid-file
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 29 01:18:34 np0005539509 ceph-osd[78089]: bdev(0x5566c7da1800 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6f69800 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: load: jerasure load: lrc 
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd8c00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluefs mount
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluefs mount shared_bdev_used = 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: RocksDB version: 7.9.2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Git sha 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: DB SUMMARY
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: DB Session ID:  YT76S9WB35YQ4FZZK94N
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: CURRENT file:  CURRENT
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                         Options.error_if_exists: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.create_if_missing: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                                     Options.env: 0x5566c7df3d50
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                                Options.info_log: 0x5566c6ff4ba0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                              Options.statistics: (nil)
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.use_fsync: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                              Options.db_log_dir: 
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.write_buffer_manager: 0x5566c7f04460
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.unordered_write: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.row_cache: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                              Options.wal_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.two_write_queues: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.wal_compression: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.atomic_flush: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_background_jobs: 4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_background_compactions: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_subcompactions: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.max_open_files: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Compression algorithms supported:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kZSTD supported: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kXpressCompression supported: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kBZip2Compression supported: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kLZ4Compression supported: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kZlibCompression supported: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kLZ4HCCompression supported: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: 	kSnappyCompression supported: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 podman[78252]: 2025-11-29 06:18:35.883948907 +0000 UTC m=+0.125330437 container create 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 podman[78252]: 2025-11-29 06:18:35.796641576 +0000 UTC m=+0.038023146 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5220)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc430
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5200)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5200)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c6ff5200)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ba850913-8330-42fb-9d78-1800ad716abe
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397115892448, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397115892720, "job": 1, "event": "recovery_finished"}
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: freelist init
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: freelist _read_cfg
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bluefs umount
Nov 29 01:18:35 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) close
Nov 29 01:18:35 np0005539509 systemd[1]: Started libpod-conmon-7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47.scope.
Nov 29 01:18:35 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:35 np0005539509 podman[78252]: 2025-11-29 06:18:35.996888427 +0000 UTC m=+0.238269977 container init 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:36 np0005539509 podman[78252]: 2025-11-29 06:18:36.007214367 +0000 UTC m=+0.248595907 container start 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:18:36 np0005539509 podman[78252]: 2025-11-29 06:18:36.012649548 +0000 UTC m=+0.254031168 container attach 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:18:36 np0005539509 sharp_morse[78462]: 167 167
Nov 29 01:18:36 np0005539509 systemd[1]: libpod-7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47.scope: Deactivated successfully.
Nov 29 01:18:36 np0005539509 podman[78252]: 2025-11-29 06:18:36.01547748 +0000 UTC m=+0.256859050 container died 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 01:18:36 np0005539509 systemd[1]: var-lib-containers-storage-overlay-4d8e2971b3b19a69b047dac4446fb5bfe70bddbfe1e367571c85c15542a21be2-merged.mount: Deactivated successfully.
Nov 29 01:18:36 np0005539509 podman[78252]: 2025-11-29 06:18:36.061849291 +0000 UTC m=+0.303230821 container remove 7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sharp_morse, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:18:36 np0005539509 systemd[1]: libpod-conmon-7242044ea9d60fb1b2a12099cc01e1f1d95e379f26abdc6e8769fe2951e04c47.scope: Deactivated successfully.
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bdev(0x5566c6fd9400 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluefs mount
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluefs mount shared_bdev_used = 4718592
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: RocksDB version: 7.9.2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Git sha 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: DB SUMMARY
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: DB Session ID:  YT76S9WB35YQ4FZZK94M
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: CURRENT file:  CURRENT
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                         Options.error_if_exists: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.create_if_missing: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                                     Options.env: 0x5566c7fd0460
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                                Options.info_log: 0x5566c6ff4980
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                              Options.statistics: (nil)
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.use_fsync: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                              Options.db_log_dir: 
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.write_buffer_manager: 0x5566c7f04460
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.unordered_write: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.row_cache: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                              Options.wal_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.two_write_queues: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.wal_compression: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.atomic_flush: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_background_jobs: 4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_background_compactions: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_subcompactions: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.max_open_files: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Compression algorithms supported:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kZSTD supported: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kXpressCompression supported: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kBZip2Compression supported: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kLZ4Compression supported: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kZlibCompression supported: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: #011kSnappyCompression supported: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc2d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc2d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5566c6fdc2d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072780)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdc2d0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072720)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072720)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:           Options.merge_operator: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5566c7072720)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5566c6fdd350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.compression: LZ4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.num_levels: 7
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ba850913-8330-42fb-9d78-1800ad716abe
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116160664, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116165347, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397116, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ba850913-8330-42fb-9d78-1800ad716abe", "db_session_id": "YT76S9WB35YQ4FZZK94M", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116168047, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1593, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 467, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397116, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ba850913-8330-42fb-9d78-1800ad716abe", "db_session_id": "YT76S9WB35YQ4FZZK94M", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116171775, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397116, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ba850913-8330-42fb-9d78-1800ad716abe", "db_session_id": "YT76S9WB35YQ4FZZK94M", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397116173098, "job": 1, "event": "recovery_finished"}
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5566c70afc00
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: DB pointer 0x5566c701fa00
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 460.80 MB usag
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: _get_class not permitted to load lua
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: _get_class not permitted to load sdk
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: _get_class not permitted to load test_remote_reads
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 load_pgs
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 load_pgs opened 0 pgs
Nov 29 01:18:36 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0[78085]: 2025-11-29T06:18:36.205+0000 7ff69099e740 -1 osd.0 0 log_to_monitors true
Nov 29 01:18:36 np0005539509 ceph-osd[78089]: osd.0 0 log_to_monitors true
Nov 29 01:18:36 np0005539509 podman[78669]: 2025-11-29 06:18:36.261419716 +0000 UTC m=+0.051503795 container create 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 01:18:36 np0005539509 systemd[1]: Started libpod-conmon-7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b.scope.
Nov 29 01:18:36 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:36 np0005539509 podman[78669]: 2025-11-29 06:18:36.238916696 +0000 UTC m=+0.029000855 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:36 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:36 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:36 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:36 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:36 np0005539509 podman[78669]: 2025-11-29 06:18:36.449645041 +0000 UTC m=+0.239729130 container init 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef)
Nov 29 01:18:36 np0005539509 podman[78669]: 2025-11-29 06:18:36.457360682 +0000 UTC m=+0.247444771 container start 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Nov 29 01:18:36 np0005539509 podman[78669]: 2025-11-29 06:18:36.555555274 +0000 UTC m=+0.345639393 container attach 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]: {
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:    "f793b967-de22-4105-bb0d-c91464bf150f": {
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:        "ceph_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:        "osd_id": 0,
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:        "osd_uuid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:        "type": "bluestore"
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]:    }
Nov 29 01:18:37 np0005539509 ecstatic_thompson[78718]: }
Nov 29 01:18:37 np0005539509 systemd[1]: libpod-7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b.scope: Deactivated successfully.
Nov 29 01:18:37 np0005539509 podman[78669]: 2025-11-29 06:18:37.357205132 +0000 UTC m=+1.147289251 container died 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:37 np0005539509 systemd[1]: var-lib-containers-storage-overlay-49d81638df49d455b99611a8fcce6fdd1f8848e459de23f369309554e109d9f8-merged.mount: Deactivated successfully.
Nov 29 01:18:37 np0005539509 podman[78669]: 2025-11-29 06:18:37.41963113 +0000 UTC m=+1.209715209 container remove 7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ecstatic_thompson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:37 np0005539509 systemd[1]: libpod-conmon-7ec1722bb71a509abf69a395b3f6f1d672a92926a43556eddbc6d088f3004c1b.scope: Deactivated successfully.
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0 done with init, starting boot process
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0 start_boot
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 29 01:18:37 np0005539509 ceph-osd[78089]: osd.0 0  bench count 12288000 bsize 4 KiB
Nov 29 01:18:38 np0005539509 podman[78973]: 2025-11-29 06:18:38.968398794 +0000 UTC m=+0.085767057 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Nov 29 01:18:39 np0005539509 podman[78973]: 2025-11-29 06:18:39.157876506 +0000 UTC m=+0.275244749 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.162672336 +0000 UTC m=+0.098177123 container create 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.106746403 +0000 UTC m=+0.042251240 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:41 np0005539509 systemd[1]: Started libpod-conmon-68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6.scope.
Nov 29 01:18:41 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.304857217 +0000 UTC m=+0.240361974 container init 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.317155589 +0000 UTC m=+0.252660376 container start 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Nov 29 01:18:41 np0005539509 pedantic_golick[79307]: 167 167
Nov 29 01:18:41 np0005539509 systemd[1]: libpod-68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6.scope: Deactivated successfully.
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.343673469 +0000 UTC m=+0.279178216 container attach 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.344245392 +0000 UTC m=+0.279750169 container died 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 01:18:41 np0005539509 systemd[1]: var-lib-containers-storage-overlay-63bbdbab1fec8dc963214bf1ba81c5684776540ca9f8caafa8f344b10310b0e3-merged.mount: Deactivated successfully.
Nov 29 01:18:41 np0005539509 podman[79291]: 2025-11-29 06:18:41.512655655 +0000 UTC m=+0.448160442 container remove 68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pedantic_golick, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:41 np0005539509 systemd[1]: libpod-conmon-68e527bc6a0681daa1f96e1df12b7dae5e9538e8502b7ddfc57af4068737b0e6.scope: Deactivated successfully.
Nov 29 01:18:41 np0005539509 podman[79331]: 2025-11-29 06:18:41.761456006 +0000 UTC m=+0.090668997 container create 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:18:41 np0005539509 podman[79331]: 2025-11-29 06:18:41.709141812 +0000 UTC m=+0.038354873 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:18:41 np0005539509 systemd[1]: Started libpod-conmon-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope.
Nov 29 01:18:41 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:18:41 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:41 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:41 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:41 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:18:41 np0005539509 podman[79331]: 2025-11-29 06:18:41.913687558 +0000 UTC m=+0.242900639 container init 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 29 01:18:41 np0005539509 podman[79331]: 2025-11-29 06:18:41.922530835 +0000 UTC m=+0.251743826 container start 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:18:41 np0005539509 podman[79331]: 2025-11-29 06:18:41.947725165 +0000 UTC m=+0.276938166 container attach 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:18:43 np0005539509 admiring_wing[79347]: [
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:    {
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "available": false,
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "ceph_device": false,
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "lsm_data": {},
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "lvs": [],
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "path": "/dev/sr0",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "rejected_reasons": [
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "Has a FileSystem",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "Insufficient space (<5GB)"
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        ],
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        "sys_api": {
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "actuators": null,
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "device_nodes": "sr0",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "devname": "sr0",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "human_readable_size": "482.00 KB",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "id_bus": "ata",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "model": "QEMU DVD-ROM",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "nr_requests": "2",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "parent": "/dev/sr0",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "partitions": {},
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "path": "/dev/sr0",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "removable": "1",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "rev": "2.5+",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "ro": "0",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "rotational": "1",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "sas_address": "",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "sas_device_handle": "",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "scheduler_mode": "mq-deadline",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "sectors": 0,
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "sectorsize": "2048",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "size": 493568.0,
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "support_discard": "2048",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "type": "disk",
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:            "vendor": "QEMU"
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:        }
Nov 29 01:18:43 np0005539509 admiring_wing[79347]:    }
Nov 29 01:18:43 np0005539509 admiring_wing[79347]: ]
Nov 29 01:18:43 np0005539509 systemd[1]: libpod-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope: Deactivated successfully.
Nov 29 01:18:43 np0005539509 podman[79331]: 2025-11-29 06:18:43.271902737 +0000 UTC m=+1.601115718 container died 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 01:18:43 np0005539509 systemd[1]: libpod-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope: Consumed 1.353s CPU time.
Nov 29 01:18:43 np0005539509 systemd[1]: var-lib-containers-storage-overlay-36ffeb5645fb884e536821a50008b21babfed81117b73b78108d744dd5709a36-merged.mount: Deactivated successfully.
Nov 29 01:18:43 np0005539509 podman[79331]: 2025-11-29 06:18:43.476395812 +0000 UTC m=+1.805608793 container remove 4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=admiring_wing, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Nov 29 01:18:43 np0005539509 systemd[1]: libpod-conmon-4dbd50c3c3434e9f643208075705756ef4c356d4cbf38bf79a7090a9f2d1f2cf.scope: Deactivated successfully.
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 11.852 iops: 3033.996 elapsed_sec: 0.989
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: log_channel(cluster) log [WRN] : OSD bench result of 3033.995593 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 0 waiting for initial osdmap
Nov 29 01:18:43 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0[78085]: 2025-11-29T06:18:43.486+0000 7ff68c91e640 -1 osd.0 0 waiting for initial osdmap
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 crush map has features 288514050185494528, adjusting msgr requires for clients
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 crush map has features 3314932999778484224, adjusting msgr requires for osds
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 check_osdmap_features require_osd_release unknown -> reef
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 set_numa_affinity not setting numa affinity
Nov 29 01:18:43 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-0[78085]: 2025-11-29T06:18:43.659+0000 7ff687f46640 -1 osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 01:18:43 np0005539509 ceph-osd[78089]: osd.0 7 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 29 01:18:44 np0005539509 ceph-osd[78089]: osd.0 7 tick checking mon for new map
Nov 29 01:18:46 np0005539509 ceph-osd[78089]: osd.0 8 state: booting -> active
Nov 29 01:18:51 np0005539509 ceph-osd[78089]: osd.0 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 29 01:18:51 np0005539509 ceph-osd[78089]: osd.0 11 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Nov 29 01:18:51 np0005539509 ceph-osd[78089]: osd.0 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 29 01:18:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 10 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [0] r=0 lpr=10 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 12 pg[2.0( empty local-lis/les=0/0 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=12) [0] r=0 lpr=12 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 12 pg[1.0( empty local-lis/les=10/12 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=10) [0] r=0 lpr=10 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:18:53 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 13 pg[2.0( empty local-lis/les=12/13 n=0 ec=12/12 lis/c=0/0 les/c/f=0/0/0 sis=12) [0] r=0 lpr=12 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 17 pg[2.0( empty local-lis/les=12/13 n=0 ec=12/12 lis/c=12/12 les/c/f=13/13/0 sis=17 pruub=10.059963226s) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active pruub 33.613990784s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.0( empty local-lis/les=12/13 n=0 ec=12/12 lis/c=12/12 les/c/f=13/13/0 sis=17 pruub=10.059963226s) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown pruub 33.613990784s@ mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.5( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.4( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.2( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.8( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.3( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.9( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.6( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.7( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.b( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.a( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.c( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.d( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.e( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.f( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.10( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.11( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.12( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1a( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.13( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.14( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1b( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.16( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.15( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.17( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1e( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1f( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1c( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.1d( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.18( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:18:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 18 pg[2.19( empty local-lis/les=12/13 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.9( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.7( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.8( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.6( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.4( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.0( empty local-lis/les=17/19 n=0 ec=12/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.3( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.2( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.10( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.13( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.11( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.15( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.14( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.16( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.17( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.19( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:00 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 19 pg[2.1b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=12/12 les/c/f=13/13/0 sis=17) [0] r=0 lpr=17 pi=[12,17)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:01 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Nov 29 01:19:01 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Nov 29 01:19:02 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Nov 29 01:19:02 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Nov 29 01:19:04 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Nov 29 01:19:04 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Nov 29 01:19:04 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 21 pg[7.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:19:05 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Nov 29 01:19:05 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Nov 29 01:19:05 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 22 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:19:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Nov 29 01:19:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Nov 29 01:19:07 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.6 deep-scrub starts
Nov 29 01:19:07 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.6 deep-scrub ok
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.235153198s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.618747711s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238571167s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622200012s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.235013008s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.618747711s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238540649s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622329712s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238471031s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622200012s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.a( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238486290s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622329712s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.6( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238062859s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622528076s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.6( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.238033295s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622528076s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.4( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237872124s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622528076s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.4( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237847328s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622528076s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237704277s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622562408s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237682343s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622562408s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237563133s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622646332s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237541199s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622646332s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237453461s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622661591s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237430573s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622661591s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237401009s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622741699s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.9( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237115860s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622425079s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.e( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237380028s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622741699s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.10( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237191200s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622676849s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.13( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237181664s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622734070s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.10( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237103462s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622676849s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.15( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237176895s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622810364s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.13( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237123489s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622734070s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.15( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.237146378s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622810364s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.9( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236943245s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622425079s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.19( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236249924s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622917175s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236279488s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 40.622924805s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.19( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236173630s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622917175s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:08 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 24 pg[2.1b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=24 pruub=8.236118317s) [1] r=-1 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 40.622924805s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:11 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Nov 29 01:19:11 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Nov 29 01:19:12 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.b scrub starts
Nov 29 01:19:12 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.b scrub ok
Nov 29 01:19:18 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.f deep-scrub starts
Nov 29 01:19:18 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.f deep-scrub ok
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.359686699 +0000 UTC m=+0.067614163 container create ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.317525072 +0000 UTC m=+0.025452516 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:19 np0005539509 systemd[1]: Started libpod-conmon-ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4.scope.
Nov 29 01:19:19 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.499093687 +0000 UTC m=+0.207021211 container init ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.511517674 +0000 UTC m=+0.219445138 container start ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:19 np0005539509 flamboyant_bardeen[80538]: 167 167
Nov 29 01:19:19 np0005539509 systemd[1]: libpod-ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4.scope: Deactivated successfully.
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.533347909 +0000 UTC m=+0.241275333 container attach ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.534371272 +0000 UTC m=+0.242298696 container died ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:19 np0005539509 systemd[1]: var-lib-containers-storage-overlay-f8d6376ba1ffb61aae378cdd04af6587cedb52dad915bac4e0d41fffce32b962-merged.mount: Deactivated successfully.
Nov 29 01:19:19 np0005539509 podman[80521]: 2025-11-29 06:19:19.837244144 +0000 UTC m=+0.545171608 container remove ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_bardeen, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:19 np0005539509 systemd[1]: libpod-conmon-ae15ef7d8f76bb54df55d491c18040e0719e9d7637ad9e638089ee0b883a57d4.scope: Deactivated successfully.
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:20.006343193 +0000 UTC m=+0.123754573 container create 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:19.924315089 +0000 UTC m=+0.041726489 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:20 np0005539509 systemd[1]: Started libpod-conmon-318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819.scope.
Nov 29 01:19:20 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:20 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:20 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:20 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:20 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:20.079411677 +0000 UTC m=+0.196823077 container init 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:20.089365368 +0000 UTC m=+0.206776748 container start 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:20.09623029 +0000 UTC m=+0.213641690 container attach 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 01:19:20 np0005539509 systemd[1]: libpod-318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819.scope: Deactivated successfully.
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:20.468713879 +0000 UTC m=+0.586125299 container died 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 01:19:20 np0005539509 systemd[1]: var-lib-containers-storage-overlay-9cc792b2a68caa0e6d619839f761005b4a61dd5abfcc89b0469664e74e7b305d-merged.mount: Deactivated successfully.
Nov 29 01:19:20 np0005539509 podman[80558]: 2025-11-29 06:19:20.579477821 +0000 UTC m=+0.696889211 container remove 318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_mclaren, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:20 np0005539509 systemd[1]: libpod-conmon-318db8c9cdf82c718a3d748869644a8a0478b2ec1831a1f5cfd572851331c819.scope: Deactivated successfully.
Nov 29 01:19:20 np0005539509 systemd[1]: Reloading.
Nov 29 01:19:20 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:20 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:21 np0005539509 systemd[1]: Reloading.
Nov 29 01:19:21 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:21 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:21 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Nov 29 01:19:21 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Nov 29 01:19:21 np0005539509 systemd[1]: Starting Ceph mon.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:19:21 np0005539509 podman[80734]: 2025-11-29 06:19:21.646084309 +0000 UTC m=+0.074928517 container create 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:21 np0005539509 podman[80734]: 2025-11-29 06:19:21.600353442 +0000 UTC m=+0.029197640 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:21 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:21 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:21 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:21 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c88a17628323222b9dfc9d24fc0d50145acb7a393baa442577f5e0ba9ac7f73/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:21 np0005539509 podman[80734]: 2025-11-29 06:19:21.724093913 +0000 UTC m=+0.152938071 container init 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 29 01:19:21 np0005539509 podman[80734]: 2025-11-29 06:19:21.728779357 +0000 UTC m=+0.157623515 container start 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:21 np0005539509 bash[80734]: 6c6562254e3e4a8763ba2492de731371242690cebdb554011375bcced4c68e5c
Nov 29 01:19:21 np0005539509 systemd[1]: Started Ceph mon.compute-1 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: pidfile_write: ignore empty --pid-file
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: load: jerasure load: lrc 
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: RocksDB version: 7.9.2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Git sha 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: DB SUMMARY
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: DB Session ID:  5Q1WIIQG9BN5XI35108Y
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: CURRENT file:  CURRENT
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                         Options.error_if_exists: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.create_if_missing: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                                     Options.env: 0x562154236c40
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                                Options.info_log: 0x562155f78fc0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                              Options.statistics: (nil)
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                               Options.use_fsync: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                              Options.db_log_dir: 
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                                 Options.wal_dir: 
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                    Options.write_buffer_manager: 0x562155f88b40
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.unordered_write: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                               Options.row_cache: None
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                              Options.wal_filter: None
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.two_write_queues: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.wal_compression: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.atomic_flush: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.max_background_jobs: 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.max_background_compactions: -1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.max_subcompactions: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.max_total_wal_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                          Options.max_open_files: -1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:       Options.compaction_readahead_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Compression algorithms supported:
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kZSTD supported: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kXpressCompression supported: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kBZip2Compression supported: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kLZ4Compression supported: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kZlibCompression supported: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: #011kSnappyCompression supported: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:           Options.merge_operator: 
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562155f78c00)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x562155f711f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:        Options.write_buffer_size: 33554432
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:  Options.max_write_buffer_number: 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.compression: NoCompression
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397161777911, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397161885594, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397161885807, "job": 1, "event": "recovery_finished"}
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562155f9ae00
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: DB pointer 0x5621560a2000
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 01:19:21 np0005539509 ceph-mon[80754]: mon.compute-1@-1(???) e0 preinit fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).mds e1 new map
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 3314933000852226048, adjusting msgr requires
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Deploying daemon mon.compute-2 on compute-2
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check cleared: CEPHADM_REFRESH_FAILED (was: failed to probe daemons or devices)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Nov 29 01:19:22 np0005539509 ceph-mon[80754]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Nov 29 01:19:22 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Nov 29 01:19:22 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Nov 29 01:19:25 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Nov 29 01:19:25 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Nov 29 01:19:26 np0005539509 ceph-mon[80754]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Nov 29 01:19:26 np0005539509 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 01:19:26 np0005539509 ceph-mon[80754]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 29 01:19:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:27 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Nov 29 01:19:27 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Nov 29 01:19:28 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Nov 29 01:19:28 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: Deploying daemon mon.compute-1 on compute-1
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2025-11-29T06:19:20.235344Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: Deploying daemon mgr.compute-2.ngsyhe on compute-2
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:19:29 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Nov 29 01:19:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Nov 29 01:19:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.367056762 +0000 UTC m=+0.054811536 container create ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Nov 29 01:19:30 np0005539509 systemd[72642]: Starting Mark boot as successful...
Nov 29 01:19:30 np0005539509 systemd[1]: Started libpod-conmon-ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17.scope.
Nov 29 01:19:30 np0005539509 systemd[72642]: Finished Mark boot as successful.
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.342121769 +0000 UTC m=+0.029876553 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:30 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.460239555 +0000 UTC m=+0.147994329 container init ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.473560336 +0000 UTC m=+0.161315110 container start ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507)
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.478437351 +0000 UTC m=+0.166192115 container attach ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:30 np0005539509 goofy_mcclintock[80951]: 167 167
Nov 29 01:19:30 np0005539509 systemd[1]: libpod-ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17.scope: Deactivated successfully.
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.483724869 +0000 UTC m=+0.171479643 container died ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 29 01:19:30 np0005539509 ceph-mon[80754]: Deploying daemon mgr.compute-1.gaxpay on compute-1
Nov 29 01:19:30 np0005539509 systemd[1]: var-lib-containers-storage-overlay-bc21a13aab8904e76099b5019bb20a83d33371e7a6924527526aec83b2b0cadb-merged.mount: Deactivated successfully.
Nov 29 01:19:30 np0005539509 podman[80934]: 2025-11-29 06:19:30.539262213 +0000 UTC m=+0.227016987 container remove ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=goofy_mcclintock, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Nov 29 01:19:30 np0005539509 systemd[1]: libpod-conmon-ca85b3bd4eef0567fe43d833fb2d7f652bd5ad81e6467f23b6114dc1dd04aa17.scope: Deactivated successfully.
Nov 29 01:19:30 np0005539509 systemd[1]: Reloading.
Nov 29 01:19:30 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:30 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:30 np0005539509 systemd[1]: Reloading.
Nov 29 01:19:31 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:31 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:31 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1a deep-scrub starts
Nov 29 01:19:31 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1a deep-scrub ok
Nov 29 01:19:31 np0005539509 systemd[1]: Starting Ceph mgr.compute-1.gaxpay for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:19:31 np0005539509 podman[81097]: 2025-11-29 06:19:31.522695665 +0000 UTC m=+0.058182730 container create a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Nov 29 01:19:31 np0005539509 podman[81097]: 2025-11-29 06:19:31.491901288 +0000 UTC m=+0.027388403 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:31 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a5a650bd8be7eb3991a168818abad049a3e29f9b126c9e3b83ec22bdd71fbe1/merged/var/lib/ceph/mgr/ceph-compute-1.gaxpay supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:31 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2714267067' entity='client.admin' 
Nov 29 01:19:31 np0005539509 podman[81097]: 2025-11-29 06:19:31.611064573 +0000 UTC m=+0.146551688 container init a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:31 np0005539509 podman[81097]: 2025-11-29 06:19:31.619481828 +0000 UTC m=+0.154968873 container start a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:31 np0005539509 bash[81097]: a8b9f68ee8f2adfa4e2e87b3aa53b3fca9f9c47317ab426c07bfc0ef9f58e64c
Nov 29 01:19:31 np0005539509 systemd[1]: Started Ceph mgr.compute-1.gaxpay for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:31 np0005539509 ceph-mgr[81116]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:19:31 np0005539509 ceph-mgr[81116]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Nov 29 01:19:31 np0005539509 ceph-mgr[81116]: pidfile_write: ignore empty --pid-file
Nov 29 01:19:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:31 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'alerts'
Nov 29 01:19:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1019927404 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:32 np0005539509 ceph-mgr[81116]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 01:19:32 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'balancer'
Nov 29 01:19:32 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:32.259+0000 7f8d11e36140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 01:19:32 np0005539509 ceph-mgr[81116]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 01:19:32 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'cephadm'
Nov 29 01:19:32 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:32.603+0000 7f8d11e36140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: Saving service ingress.rgw.default spec with placement count:2
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Nov 29 01:19:32 np0005539509 ceph-mon[80754]: Deploying daemon crash.compute-2 on compute-2
Nov 29 01:19:34 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'crash'
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:35.000949576 +0000 UTC m=+0.045536127 container create 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Nov 29 01:19:35 np0005539509 ceph-mgr[81116]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 01:19:35 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'dashboard'
Nov 29 01:19:35 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:35.013+0000 7f8d11e36140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 01:19:35 np0005539509 systemd[1]: Started libpod-conmon-675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3.scope.
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:34.981242158 +0000 UTC m=+0.025828779 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:35 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:35.110543945 +0000 UTC m=+0.155130496 container init 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2)
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:35.121650754 +0000 UTC m=+0.166237315 container start 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:35.128402722 +0000 UTC m=+0.172989303 container attach 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:35 np0005539509 stoic_volhard[81306]: 167 167
Nov 29 01:19:35 np0005539509 systemd[1]: libpod-675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3.scope: Deactivated successfully.
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:35.135025827 +0000 UTC m=+0.179612368 container died 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:19:35 np0005539509 systemd[1]: var-lib-containers-storage-overlay-7c965bbd6278b0896e99215cf57fa69530bd7c85476f7d3218ed7e7c40963309-merged.mount: Deactivated successfully.
Nov 29 01:19:35 np0005539509 podman[81290]: 2025-11-29 06:19:35.183177636 +0000 UTC m=+0.227764157 container remove 675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_volhard, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Nov 29 01:19:35 np0005539509 systemd[1]: libpod-conmon-675e052d23ceebf10b4b218086beb7c6206a60e5aabd9eaef2d2b32c898bc6e3.scope: Deactivated successfully.
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539509 podman[81331]: 2025-11-29 06:19:35.388372905 +0000 UTC m=+0.078094603 container create ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 01:19:35 np0005539509 systemd[1]: Started libpod-conmon-ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0.scope.
Nov 29 01:19:35 np0005539509 podman[81331]: 2025-11-29 06:19:35.356333144 +0000 UTC m=+0.046054892 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:35 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:35 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539509 podman[81331]: 2025-11-29 06:19:35.519189825 +0000 UTC m=+0.208911563 container init ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Nov 29 01:19:35 np0005539509 podman[81331]: 2025-11-29 06:19:35.535406146 +0000 UTC m=+0.225127824 container start ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:35 np0005539509 podman[81331]: 2025-11-29 06:19:35.539549912 +0000 UTC m=+0.229271580 container attach ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e2 new map
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:19:35.589013+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Nov 29 01:19:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e30 e30: 2 total, 2 up, 2 in
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:36 np0005539509 upbeat_ellis[81347]: --> passed data devices: 0 physical, 1 LVM
Nov 29 01:19:36 np0005539509 upbeat_ellis[81347]: --> relative data size: 1.0
Nov 29 01:19:36 np0005539509 upbeat_ellis[81347]: --> All data devices are unavailable
Nov 29 01:19:36 np0005539509 systemd[1]: libpod-ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0.scope: Deactivated successfully.
Nov 29 01:19:36 np0005539509 podman[81362]: 2025-11-29 06:19:36.417189189 +0000 UTC m=+0.026466027 container died ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 01:19:36 np0005539509 systemd[1]: var-lib-containers-storage-overlay-1c6a0cd60ce55f3e8464b05abadb52bc05e482c98d3bbe543ed12de83a9ce5b2-merged.mount: Deactivated successfully.
Nov 29 01:19:36 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'devicehealth'
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Nov 29 01:19:36 np0005539509 podman[81362]: 2025-11-29 06:19:36.568600202 +0000 UTC m=+0.177877020 container remove ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=upbeat_ellis, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:36 np0005539509 systemd[1]: libpod-conmon-ed11f7a86751631aef3053116e977a7ffd6796d2433a14c9223bf90b6d16f7f0.scope: Deactivated successfully.
Nov 29 01:19:36 np0005539509 ceph-mgr[81116]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 01:19:36 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'diskprediction_local'
Nov 29 01:19:36 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:36.819+0000 7f8d11e36140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 01:19:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020053102 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Nov 29 01:19:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.273435461 +0000 UTC m=+0.040187799 container create e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:37 np0005539509 systemd[1]: Started libpod-conmon-e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1.scope.
Nov 29 01:19:37 np0005539509 ceph-mon[80754]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 01:19:37 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.102:0/2624547066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 01:19:37 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 01:19:37 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]': finished
Nov 29 01:19:37 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:37 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.257268742 +0000 UTC m=+0.024021110 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.366271954 +0000 UTC m=+0.133024312 container init e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.37871557 +0000 UTC m=+0.145467918 container start e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.383071981 +0000 UTC m=+0.149824339 container attach e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2)
Nov 29 01:19:37 np0005539509 optimistic_wing[81535]: 167 167
Nov 29 01:19:37 np0005539509 systemd[1]: libpod-e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1.scope: Deactivated successfully.
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.385724505 +0000 UTC m=+0.152476863 container died e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 01:19:37 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 29 01:19:37 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 29 01:19:37 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]:  from numpy import show_config as show_numpy_config
Nov 29 01:19:37 np0005539509 systemd[1]: var-lib-containers-storage-overlay-57188317ae84602b08fa483a67ef1e66b83325c8218fd4118670772c88cd54e8-merged.mount: Deactivated successfully.
Nov 29 01:19:37 np0005539509 ceph-mgr[81116]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 01:19:37 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:37.418+0000 7f8d11e36140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 01:19:37 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'influx'
Nov 29 01:19:37 np0005539509 podman[81519]: 2025-11-29 06:19:37.436402495 +0000 UTC m=+0.203154853 container remove e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=optimistic_wing, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:37 np0005539509 systemd[1]: libpod-conmon-e9f5c7394765264b53ea3a14276a28545d48a7db03fa2060bc3ea6a2ac4031c1.scope: Deactivated successfully.
Nov 29 01:19:37 np0005539509 podman[81559]: 2025-11-29 06:19:37.61414869 +0000 UTC m=+0.049041945 container create 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:37 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:37.656+0000 7f8d11e36140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 01:19:37 np0005539509 ceph-mgr[81116]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 01:19:37 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'insights'
Nov 29 01:19:37 np0005539509 systemd[1]: Started libpod-conmon-8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5.scope.
Nov 29 01:19:37 np0005539509 podman[81559]: 2025-11-29 06:19:37.59471116 +0000 UTC m=+0.029604445 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:37 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:37 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:37 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:37 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:37 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:37 np0005539509 podman[81559]: 2025-11-29 06:19:37.713074973 +0000 UTC m=+0.147968258 container init 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 01:19:37 np0005539509 podman[81559]: 2025-11-29 06:19:37.721385044 +0000 UTC m=+0.156278339 container start 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:37 np0005539509 podman[81559]: 2025-11-29 06:19:37.726343333 +0000 UTC m=+0.161236608 container attach 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 01:19:37 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'iostat'
Nov 29 01:19:38 np0005539509 ceph-mgr[81116]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 01:19:38 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'k8sevents'
Nov 29 01:19:38 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:38.149+0000 7f8d11e36140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 01:19:38 np0005539509 romantic_carver[81575]: {
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:    "0": [
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:        {
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "devices": [
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "/dev/loop3"
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            ],
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "lv_name": "ceph_lv0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "lv_size": "7511998464",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=336ec58c-893b-528f-a0c1-6ed1196bc047,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f793b967-de22-4105-bb0d-c91464bf150f,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "lv_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "name": "ceph_lv0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "tags": {
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.block_uuid": "caUO36-j5Sh-tny0-91ng-tXie-LviW-JsmAHB",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.cephx_lockbox_secret": "",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.cluster_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.cluster_name": "ceph",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.crush_device_class": "",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.encrypted": "0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.osd_fsid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.osd_id": "0",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.osdspec_affinity": "default_drive_group",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.type": "block",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:                "ceph.vdo": "0"
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            },
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "type": "block",
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:            "vg_name": "ceph_vg0"
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:        }
Nov 29 01:19:38 np0005539509 romantic_carver[81575]:    ]
Nov 29 01:19:38 np0005539509 romantic_carver[81575]: }
Nov 29 01:19:38 np0005539509 systemd[1]: libpod-8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5.scope: Deactivated successfully.
Nov 29 01:19:38 np0005539509 podman[81559]: 2025-11-29 06:19:38.456388583 +0000 UTC m=+0.891281858 container died 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:38 np0005539509 ceph-mon[80754]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 01:19:38 np0005539509 systemd[1]: var-lib-containers-storage-overlay-bb588a27fcacf5a0d04b5f02fb51c7b0b9ba2f101c539581f9956da9fc72d395-merged.mount: Deactivated successfully.
Nov 29 01:19:38 np0005539509 podman[81559]: 2025-11-29 06:19:38.523427088 +0000 UTC m=+0.958320343 container remove 8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=romantic_carver, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:38 np0005539509 systemd[1]: libpod-conmon-8ab7ba323fd74e4b283d758133eec102fa80abfcba5a6932ef08237b7c096fb5.scope: Deactivated successfully.
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.19956206 +0000 UTC m=+0.021736876 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.311109903 +0000 UTC m=+0.133284699 container create ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Nov 29 01:19:39 np0005539509 systemd[1]: Started libpod-conmon-ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b.scope.
Nov 29 01:19:39 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.414995874 +0000 UTC m=+0.237170700 container init ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.425886756 +0000 UTC m=+0.248061542 container start ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.429066465 +0000 UTC m=+0.251241351 container attach ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:19:39 np0005539509 serene_diffie[81752]: 167 167
Nov 29 01:19:39 np0005539509 systemd[1]: libpod-ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b.scope: Deactivated successfully.
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.433392156 +0000 UTC m=+0.255567012 container died ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:39 np0005539509 systemd[1]: var-lib-containers-storage-overlay-b783fb19fe463c7bd64a6b3c05d5b305fecd2a1637d4528f1aad4d4c1d01992f-merged.mount: Deactivated successfully.
Nov 29 01:19:39 np0005539509 podman[81735]: 2025-11-29 06:19:39.493220949 +0000 UTC m=+0.315395745 container remove ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_diffie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:39 np0005539509 systemd[1]: libpod-conmon-ea9e6a0e30d0e7ed953eb8b7f478ed0f401d0d7a49d09db41e6685151379f40b.scope: Deactivated successfully.
Nov 29 01:19:39 np0005539509 podman[81775]: 2025-11-29 06:19:39.708594262 +0000 UTC m=+0.055861816 container create fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:39 np0005539509 systemd[1]: Started libpod-conmon-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope.
Nov 29 01:19:39 np0005539509 podman[81775]: 2025-11-29 06:19:39.678437263 +0000 UTC m=+0.025704877 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:39 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:19:39 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:39 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:39 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:39 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:39 np0005539509 podman[81775]: 2025-11-29 06:19:39.808394939 +0000 UTC m=+0.155662543 container init fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:39 np0005539509 podman[81775]: 2025-11-29 06:19:39.822632965 +0000 UTC m=+0.169900479 container start fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:39 np0005539509 podman[81775]: 2025-11-29 06:19:39.838377843 +0000 UTC m=+0.185645437 container attach fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:39 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'localpool'
Nov 29 01:19:39 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Nov 29 01:19:40 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Nov 29 01:19:40 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'mds_autoscaler'
Nov 29 01:19:40 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Nov 29 01:19:40 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Nov 29 01:19:40 np0005539509 zen_thompson[81792]: {
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:    "f793b967-de22-4105-bb0d-c91464bf150f": {
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:        "ceph_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:        "osd_id": 0,
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:        "osd_uuid": "f793b967-de22-4105-bb0d-c91464bf150f",
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:        "type": "bluestore"
Nov 29 01:19:40 np0005539509 zen_thompson[81792]:    }
Nov 29 01:19:40 np0005539509 zen_thompson[81792]: }
Nov 29 01:19:41 np0005539509 systemd[1]: libpod-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope: Deactivated successfully.
Nov 29 01:19:41 np0005539509 podman[81775]: 2025-11-29 06:19:41.066536492 +0000 UTC m=+1.413804026 container died fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:41 np0005539509 systemd[1]: libpod-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope: Consumed 1.227s CPU time.
Nov 29 01:19:41 np0005539509 systemd[1]: var-lib-containers-storage-overlay-a3c2e988dc22fe37205ec4a2e43ff5e9e1f139f8b01c91f53da8011e3eb7e282-merged.mount: Deactivated successfully.
Nov 29 01:19:41 np0005539509 podman[81775]: 2025-11-29 06:19:41.132675492 +0000 UTC m=+1.479943016 container remove fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:41 np0005539509 systemd[1]: libpod-conmon-fb245b9468239dfd44a66652581347f2bbf40dd1fef4618cb6337e1af3162bbb.scope: Deactivated successfully.
Nov 29 01:19:41 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'mirroring'
Nov 29 01:19:41 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'nfs'
Nov 29 01:19:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539509 ceph-mgr[81116]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'orchestrator'
Nov 29 01:19:42 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:42.252+0000 7f8d11e36140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539509 ceph-mgr[81116]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'osd_perf_query'
Nov 29 01:19:42 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:42.964+0000 7f8d11e36140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Nov 29 01:19:43 np0005539509 ceph-mon[80754]: Deploying daemon osd.2 on compute-2
Nov 29 01:19:43 np0005539509 ceph-mgr[81116]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 01:19:43 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'osd_support'
Nov 29 01:19:43 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:43.272+0000 7f8d11e36140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 01:19:43 np0005539509 ceph-mgr[81116]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 01:19:43 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:43.564+0000 7f8d11e36140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 01:19:43 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'pg_autoscaler'
Nov 29 01:19:43 np0005539509 ceph-mgr[81116]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 01:19:43 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'progress'
Nov 29 01:19:43 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:43.861+0000 7f8d11e36140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 01:19:44 np0005539509 ceph-mgr[81116]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 01:19:44 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'prometheus'
Nov 29 01:19:44 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:44.108+0000 7f8d11e36140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 01:19:45 np0005539509 ceph-mgr[81116]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 01:19:45 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'rbd_support'
Nov 29 01:19:45 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:45.140+0000 7f8d11e36140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 01:19:45 np0005539509 ceph-mgr[81116]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 01:19:45 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'restful'
Nov 29 01:19:45 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:45.438+0000 7f8d11e36140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'rgw'
Nov 29 01:19:46 np0005539509 ceph-mgr[81116]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'rook'
Nov 29 01:19:46 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:46.921+0000 7f8d11e36140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 01:19:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:47 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/2969688060' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Nov 29 01:19:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'selftest'
Nov 29 01:19:49 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:49.119+0000 7f8d11e36140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'snap_schedule'
Nov 29 01:19:49 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:49.360+0000 7f8d11e36140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e32 e32: 3 total, 2 up, 3 in
Nov 29 01:19:49 np0005539509 ceph-mon[80754]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 01:19:49 np0005539509 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:49.609+0000 7f8d11e36140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'stats'
Nov 29 01:19:49 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'status'
Nov 29 01:19:50 np0005539509 ceph-mgr[81116]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 01:19:50 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'telegraf'
Nov 29 01:19:50 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:50.134+0000 7f8d11e36140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 01:19:50 np0005539509 ceph-mgr[81116]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 01:19:50 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'telemetry'
Nov 29 01:19:50 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:50.378+0000 7f8d11e36140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 01:19:50 np0005539509 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Nov 29 01:19:50 np0005539509 ceph-mon[80754]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 01:19:50 np0005539509 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 01:19:50 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:50 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:51 np0005539509 ceph-mgr[81116]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 01:19:51 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'test_orchestrator'
Nov 29 01:19:51 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:51.017+0000 7f8d11e36140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 01:19:51 np0005539509 podman[82051]: 2025-11-29 06:19:51.308914885 +0000 UTC m=+0.119982269 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 29 01:19:51 np0005539509 podman[82051]: 2025-11-29 06:19:51.509043133 +0000 UTC m=+0.320110537 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:51 np0005539509 ceph-mgr[81116]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:51 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'volumes'
Nov 29 01:19:51 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:51.713+0000 7f8d11e36140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e33 e33: 3 total, 2 up, 3 in
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786844254s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623573303s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786844254s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623573303s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786513329s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623565674s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786513329s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623565674s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786314964s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623596191s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786314964s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623596191s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785975456s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623611450s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785975456s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623611450s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785957336s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623748779s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.785957336s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623748779s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786064148s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.623954773s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786064148s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.623954773s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786125183s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 active pruub 88.624198914s@ mbc={}] start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:19:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 33 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=33 pruub=12.786125183s) [] r=-1 lpr=33 pi=[17,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 88.624198914s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:19:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:52 np0005539509 ceph-mgr[81116]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 01:19:52 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:52.433+0000 7f8d11e36140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 01:19:52 np0005539509 ceph-mgr[81116]: mgr[py] Loading python module 'zabbix'
Nov 29 01:19:52 np0005539509 ceph-mgr[81116]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 01:19:52 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-1-gaxpay[81112]: 2025-11-29T06:19:52.680+0000 7f8d11e36140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 01:19:52 np0005539509 ceph-mgr[81116]: ms_deliver_dispatch: unhandled message 0x5649d912b1e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Nov 29 01:19:52 np0005539509 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:19:53 np0005539509 ceph-mon[80754]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Nov 29 01:19:53 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:53 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:53 np0005539509 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:19:54 np0005539509 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:19:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e34 e34: 3 total, 2 up, 3 in
Nov 29 01:19:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:19:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:19:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:19:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e35 e35: 3 total, 2 up, 3 in
Nov 29 01:19:59 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e36 e36: 3 total, 2 up, 3 in
Nov 29 01:20:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: Health detail: HEALTH_ERR 1 filesystem is offline; 1 filesystem is online with fewer MDS than max_mds
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: [ERR] MDS_ALL_DOWN: 1 filesystem is offline
Nov 29 01:20:03 np0005539509 ceph-mon[80754]:    fs cephfs is offline because no MDS is active for it.
Nov 29 01:20:03 np0005539509 ceph-mon[80754]: [WRN] MDS_UP_LESS_THAN_MAX: 1 filesystem is online with fewer MDS than max_mds
Nov 29 01:20:03 np0005539509 ceph-mon[80754]:    fs cephfs has 0 MDS online, but wants 1
Nov 29 01:20:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e37 e37: 3 total, 2 up, 3 in
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Nov 29 01:20:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:12 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency slow operation observed for kv_commit, latency = 7.038947105s
Nov 29 01:20:12 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency slow operation observed for kv_sync, latency = 7.038947105s
Nov 29 01:20:12 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency_fn slow operation observed for _txc_committed_kv, latency = 7.039439678s, txc = 0x5566c8a4b800
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e38 e38: 3 total, 2 up, 3 in
Nov 29 01:20:13 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) log_latency_fn slow operation observed for _txc_committed_kv, latency = 7.175556183s, txc = 0x5566c8a53b00
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 1..407) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 2.359349251s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 01:20:13 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1[80750]: 2025-11-29T06:20:13.421+0000 7f7b25776640 -1 mon.compute-1@2(peon).paxos(paxos updating c 1..407) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 2.359349251s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: Updating compute-0:/etc/ceph/ceph.conf
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: Updating compute-1:/etc/ceph/ceph.conf
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 01:20:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e39 e39: 3 total, 2 up, 3 in
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: Updating compute-0:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: Updating compute-1:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: OSD bench result of 1381.921175 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1d( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.b( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=40 pruub=15.883583069s) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active pruub 117.207687378s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.1c( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.5( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.f( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.18( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[2.12( empty local-lis/les=17/19 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=-1 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:17 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 40 pg[7.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=40 pruub=15.883583069s) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown pruub 117.207687378s@ mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:27 np0005539509 ceph-mon[80754]: osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518] boot
Nov 29 01:20:28 np0005539509 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 01:20:28 np0005539509 systemd[1]: session-19.scope: Consumed 9.794s CPU time.
Nov 29 01:20:28 np0005539509 systemd-logind[785]: Session 19 logged out. Waiting for processes to exit.
Nov 29 01:20:28 np0005539509 systemd-logind[785]: Removed session 19.
Nov 29 01:20:28 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 1..414) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 4.066800594s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 01:20:28 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-1[80750]: 2025-11-29T06:20:28.616+0000 7f7b25776640 -1 mon.compute-1@2(peon).paxos(paxos updating c 1..414) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 4.066800594s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Nov 29 01:20:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1e( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1c( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1a( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.15( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.10( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.c( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.b( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.8( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.19( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.e( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.d( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.2( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.7( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.a( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.14( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.17( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.12( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1d( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=21/22 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.15( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.10( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.c( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.8( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1c( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.19( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.0( empty local-lis/les=40/41 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.2( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.7( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.14( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.17( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.12( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=21/21 les/c/f=22/22/0 sis=40) [0] r=0 lpr=40 pi=[21,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:31 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:31 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:31 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:31 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:20:34 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Nov 29 01:20:34 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Nov 29 01:20:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.c( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.e( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.d( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.a( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.8( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.5( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.2( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.3( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[6.7( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709385872s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802291870s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709323883s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802291870s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709070206s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802078247s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709043503s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802078247s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.11( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708967209s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802047729s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1d( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.709013939s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802078247s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708970070s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802078247s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.11( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708914757s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802047729s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.14( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708758354s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801986694s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708691597s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801956177s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.14( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708703041s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801986694s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.a( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708639145s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801956177s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708537102s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801925659s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.5( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708424568s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801849365s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708496094s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801925659s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.5( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708378792s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801849365s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.2( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708154678s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active+scrubbing pruub 129.801681519s@ [ 7.2:  ]  mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708103180s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801712036s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.2( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708094597s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801681519s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707938194s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801605225s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707839966s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801528931s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708559990s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.802261353s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707894325s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801605225s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707805634s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801528931s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708518028s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.802261353s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.708050728s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801712036s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707393646s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801391602s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707491875s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801498413s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.8( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707393646s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801437378s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.10( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707167625s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801223755s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707448006s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801498413s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707311630s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801391602s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707350731s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.801452637s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.10( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707128525s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801223755s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.8( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707340240s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801437378s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.707299232s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.801452637s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.16( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.431221962s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.525558472s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430930138s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.525344849s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430959702s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 129.525375366s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.16( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.431178093s) [2] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.525558472s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1e( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430882454s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.525344849s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=40/41 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=10.430913925s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 129.525375366s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.16( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.10( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1c( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.14( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.1c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1b( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.13( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.5( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.d( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[3.10( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 42 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:36 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Nov 29 01:20:36 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Nov 29 01:20:36 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.18( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:36 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.13( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.d( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.e( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.a( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.8( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.a( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.5( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.2( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.1b( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.d( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.1a( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.e( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.3( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.5( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[6.7( empty local-lis/les=42/43 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[4.c( empty local-lis/les=42/43 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [0] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1f( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.16( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.11( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.15( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.13( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.10( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.10( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.16( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.7( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.3( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.2( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.a( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.f( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1b( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.1c( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.5( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.c( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.d( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.9( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.18( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.14( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[5.1c( empty local-lis/les=42/43 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:37 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 43 pg[3.f( empty local-lis/les=42/43 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42) [0] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Nov 29 01:20:39 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.c scrub starts
Nov 29 01:20:39 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.c scrub ok
Nov 29 01:20:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 01:20:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 01:20:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:42 np0005539509 ceph-mon[80754]: Deploying daemon rgw.rgw.compute-2.pkypgd on compute-2
Nov 29 01:20:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:45 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.d scrub starts
Nov 29 01:20:45 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.d scrub ok
Nov 29 01:20:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Nov 29 01:20:47 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Nov 29 01:20:47 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Nov 29 01:20:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Nov 29 01:20:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:47 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 01:20:47 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:48 np0005539509 ceph-mon[80754]: Deploying daemon rgw.rgw.compute-1.cbugbv on compute-1
Nov 29 01:20:48 np0005539509 podman[83262]: 2025-11-29 06:20:48.904253151 +0000 UTC m=+0.057739438 container create 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:20:48 np0005539509 systemd[1]: Started libpod-conmon-93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22.scope.
Nov 29 01:20:48 np0005539509 podman[83262]: 2025-11-29 06:20:48.876140674 +0000 UTC m=+0.029627151 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:48 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:20:49 np0005539509 podman[83262]: 2025-11-29 06:20:49.009237166 +0000 UTC m=+0.162723523 container init 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Nov 29 01:20:49 np0005539509 podman[83262]: 2025-11-29 06:20:49.019962348 +0000 UTC m=+0.173448655 container start 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True)
Nov 29 01:20:49 np0005539509 podman[83262]: 2025-11-29 06:20:49.023497825 +0000 UTC m=+0.176984202 container attach 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:20:49 np0005539509 zealous_keller[83278]: 167 167
Nov 29 01:20:49 np0005539509 systemd[1]: libpod-93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22.scope: Deactivated successfully.
Nov 29 01:20:49 np0005539509 podman[83262]: 2025-11-29 06:20:49.02877882 +0000 UTC m=+0.182265117 container died 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:20:49 np0005539509 systemd[1]: var-lib-containers-storage-overlay-09e2c3d39cb101d48c690e8edf1a2653ec2d1e147340ec5325528d76126c4f18-merged.mount: Deactivated successfully.
Nov 29 01:20:49 np0005539509 podman[83262]: 2025-11-29 06:20:49.06618452 +0000 UTC m=+0.219670787 container remove 93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_keller, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 01:20:49 np0005539509 systemd[1]: libpod-conmon-93c132c08f77c4763b87bde70bd3f69d81b23087f52b317c505a6fe2c1922b22.scope: Deactivated successfully.
Nov 29 01:20:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Nov 29 01:20:49 np0005539509 systemd[1]: Reloading.
Nov 29 01:20:49 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:20:49 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:49 np0005539509 systemd[1]: Reloading.
Nov 29 01:20:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:49 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:49 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:20:49 np0005539509 systemd[1]: Starting Ceph rgw.rgw.compute-1.cbugbv for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:20:50 np0005539509 podman[83423]: 2025-11-29 06:20:49.96794633 +0000 UTC m=+0.026980067 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:50 np0005539509 podman[83423]: 2025-11-29 06:20:50.317621833 +0000 UTC m=+0.376655580 container create cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:20:50 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:50 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:50 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:50 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce5436a4d026d72b60789a4d51f19c1971e0eb434b95c7f850d53f2c2460a58/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.cbugbv supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:50 np0005539509 podman[83423]: 2025-11-29 06:20:50.49810303 +0000 UTC m=+0.557136757 container init cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True)
Nov 29 01:20:50 np0005539509 podman[83423]: 2025-11-29 06:20:50.512275786 +0000 UTC m=+0.571309493 container start cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 01:20:50 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 01:20:50 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 01:20:50 np0005539509 bash[83423]: cfaa1f3ba4a2ce0fe5c305ba0458f16f2702133085ca83fb09a233adc862d6bf
Nov 29 01:20:50 np0005539509 systemd[1]: Started Ceph rgw.rgw.compute-1.cbugbv for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:20:50 np0005539509 radosgw[83442]: deferred set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:20:50 np0005539509 radosgw[83442]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Nov 29 01:20:50 np0005539509 radosgw[83442]: framework: beast
Nov 29 01:20:50 np0005539509 radosgw[83442]: framework conf key: endpoint, val: 192.168.122.101:8082
Nov 29 01:20:50 np0005539509 radosgw[83442]: init_numa not setting numa affinity
Nov 29 01:20:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Nov 29 01:20:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Nov 29 01:20:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 01:20:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1253186838' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.17 deep-scrub starts
Nov 29 01:20:52 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 49 pg[10.0( empty local-lis/les=0/0 n=0 ec=49/49 lis/c=0/0 les/c/f=0/0/0 sis=49) [0] r=0 lpr=49 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:52 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.17 deep-scrub ok
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: Deploying daemon rgw.rgw.compute-0.vmptkp on compute-0
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.101:0/1253186838' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Nov 29 01:20:53 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 50 pg[10.0( empty local-lis/les=49/50 n=0 ec=49/49 lis/c=0/0 les/c/f=0/0/0 sis=49) [0] r=0 lpr=49 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Nov 29 01:20:54 np0005539509 ceph-mon[80754]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:56 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Nov 29 01:20:56 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Nov 29 01:20:57 np0005539509 ceph-mon[80754]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1a deep-scrub starts
Nov 29 01:20:58 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1a deep-scrub ok
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: Deploying daemon mds.cephfs.compute-2.gxdwyy on compute-2
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Nov 29 01:20:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e3 new map
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:19:35.589013+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.gxdwyy{-1:24145} state up:standby seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e4 new map
Nov 29 01:21:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:00.645745+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:creating seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Nov 29 01:21:00 np0005539509 radosgw[83442]: LDAP not started since no server URIs were provided in the configuration.
Nov 29 01:21:00 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-1-cbugbv[83438]: 2025-11-29T06:21:00.840+0000 7f2e761db940 -1 LDAP not started since no server URIs were provided in the configuration.
Nov 29 01:21:00 np0005539509 radosgw[83442]: framework: beast
Nov 29 01:21:00 np0005539509 radosgw[83442]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Nov 29 01:21:00 np0005539509 radosgw[83442]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Nov 29 01:21:00 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 29 01:21:00 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Nov 29 01:21:00 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 29 01:21:00 np0005539509 radosgw[83442]: starting handler: beast
Nov 29 01:21:00 np0005539509 radosgw[83442]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:21:00 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 29 01:21:00 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Nov 29 01:21:00 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 29 01:21:01 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Nov 29 01:21:01 np0005539509 radosgw[83442]: mgrc service_daemon_register rgw.24113 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.cbugbv,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=916ce3c8-b215-47fd-909b-03c5b552b52f,zone_name=default,zonegroup_id=a7fe8251-a74c-4f06-a680-d530d14bb192,zonegroup_name=default}
Nov 29 01:21:01 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 29 01:21:02 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: daemon mds.cephfs.compute-2.gxdwyy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: Cluster is now healthy
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: daemon mds.cephfs.compute-2.gxdwyy is now active in filesystem cephfs as rank 0
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 01:21:02 np0005539509 ceph-mon[80754]: Deploying daemon mds.cephfs.compute-0.jzycnf on compute-0
Nov 29 01:21:02 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Nov 29 01:21:03 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Nov 29 01:21:03 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Nov 29 01:21:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e5 new map
Nov 29 01:21:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Nov 29 01:21:04 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Nov 29 01:21:04 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Nov 29 01:21:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e6 new map
Nov 29 01:21:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.c scrub starts
Nov 29 01:21:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.c scrub ok
Nov 29 01:21:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e7 new map
Nov 29 01:21:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:07.004227 +0000 UTC m=+0.041254897 container create cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:21:07 np0005539509 systemd[1]: Started libpod-conmon-cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9.scope.
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:06.985016985 +0000 UTC m=+0.022044902 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:21:07 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:07.100662522 +0000 UTC m=+0.137690439 container init cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:07.110086918 +0000 UTC m=+0.147114815 container start cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:07.115500326 +0000 UTC m=+0.152528243 container attach cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:21:07 np0005539509 heuristic_ganguly[84216]: 167 167
Nov 29 01:21:07 np0005539509 systemd[1]: libpod-cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9.scope: Deactivated successfully.
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:07.121767578 +0000 UTC m=+0.158795475 container died cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 01:21:07 np0005539509 ceph-mon[80754]: Deploying daemon mds.cephfs.compute-1.vlqnad on compute-1
Nov 29 01:21:07 np0005539509 systemd[1]: var-lib-containers-storage-overlay-560408556d2562ba69fedfaee23ef6fa418e09e6722fa23dca860156a1f80608-merged.mount: Deactivated successfully.
Nov 29 01:21:07 np0005539509 podman[84201]: 2025-11-29 06:21:07.168921274 +0000 UTC m=+0.205949171 container remove cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=heuristic_ganguly, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.39.3)
Nov 29 01:21:07 np0005539509 systemd[1]: libpod-conmon-cf04fba7bceaa17c4809ce677f57b8fa944c598a0a852f20102c1b30ee81c7a9.scope: Deactivated successfully.
Nov 29 01:21:07 np0005539509 systemd[1]: Reloading.
Nov 29 01:21:07 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:07 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Nov 29 01:21:08 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.e scrub starts
Nov 29 01:21:08 np0005539509 systemd[1]: Reloading.
Nov 29 01:21:08 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.e scrub ok
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:08 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:08 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:21:08 np0005539509 systemd[1]: Starting Ceph mds.cephfs.compute-1.vlqnad for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:21:08 np0005539509 podman[84364]: 2025-11-29 06:21:08.716535702 +0000 UTC m=+0.040291981 container create 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Nov 29 01:21:08 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:21:08 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:21:08 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:21:08 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7178d6186c4b6b3a979c787166746a9f5bcffadb4a7d2f848b3ba10715393dfa/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.vlqnad supports timestamps until 2038 (0x7fffffff)
Nov 29 01:21:08 np0005539509 podman[84364]: 2025-11-29 06:21:08.78975001 +0000 UTC m=+0.113506319 container init 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:21:08 np0005539509 podman[84364]: 2025-11-29 06:21:08.698717575 +0000 UTC m=+0.022473894 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:21:08 np0005539509 podman[84364]: 2025-11-29 06:21:08.79598793 +0000 UTC m=+0.119744219 container start 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Nov 29 01:21:08 np0005539509 bash[84364]: 08cd4b182b0bede8363cef69388c2d99ea69ceb89d1599b75d6802dac432aaae
Nov 29 01:21:08 np0005539509 systemd[1]: Started Ceph mds.cephfs.compute-1.vlqnad for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:21:08 np0005539509 ceph-mds[84384]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:21:08 np0005539509 ceph-mds[84384]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Nov 29 01:21:08 np0005539509 ceph-mds[84384]: main not setting numa affinity
Nov 29 01:21:08 np0005539509 ceph-mds[84384]: pidfile_write: ignore empty --pid-file
Nov 29 01:21:08 np0005539509 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-1-vlqnad[84380]: starting mds.cephfs.compute-1.vlqnad at 
Nov 29 01:21:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:10 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Updating MDS map to version 7 from mon.2
Nov 29 01:21:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e8 new map
Nov 29 01:21:12 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Updating MDS map to version 8 from mon.2
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:12 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Monitors have assigned me to become a standby.
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:13 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 58 pg[10.0( v 54'96 (0'0,54'96] local-lis/les=49/50 n=8 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=58 pruub=11.911335945s) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 54'95 mlcod 54'95 active pruub 169.060745239s@ mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:13 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 58 pg[10.0( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=58 pruub=11.911335945s) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 54'95 mlcod 0'0 unknown pruub 169.060745239s@ mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1b( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1f( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.10( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.12( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.7( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1c( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1a( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.19( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.b( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1e( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.a( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.8( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.f( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.d( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.14( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.3( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.15( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.c( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.e( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.9( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.4( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.5( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.6( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.2( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1d( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.17( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.16( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.11( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.18( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.13( v 54'96 lc 0'0 (0'0,54'96] local-lis/les=49/50 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.7( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.19( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1c( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.a( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.8( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.14( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.15( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.3( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.c( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.d( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.0( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=49/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 54'95 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.9( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.5( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1d( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.2( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.17( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.16( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1a( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.6( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.18( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.13( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:14 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 59 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=49/49 les/c/f=50/50/0 sis=58) [0] r=0 lpr=58 pi=[49,58)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: Deploying daemon haproxy.rgw.default.compute-0.zzbnoj on compute-0
Nov 29 01:21:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Nov 29 01:21:17 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Nov 29 01:21:17 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Nov 29 01:21:17 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:17 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e9 new map
Nov 29 01:21:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:17.214295+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Nov 29 01:21:18 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:20 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.d scrub starts
Nov 29 01:21:20 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.d scrub ok
Nov 29 01:21:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e10 new map
Nov 29 01:21:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).mds e10 print_map#012e10#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:17.214295+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:21 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Updating MDS map to version 10 from mon.2
Nov 29 01:21:22 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.a scrub starts
Nov 29 01:21:22 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.a scrub ok
Nov 29 01:21:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 01:21:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.569214821s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.522583008s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575281143s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528656006s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575191498s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528656006s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1b( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.569099426s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.522583008s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574449539s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528198242s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574452400s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528381348s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574231148s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528198242s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574331284s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528381348s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.19( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574158669s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528259277s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.19( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574132919s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528259277s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.8( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574046135s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528350830s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.8( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.574021339s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528350830s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573841095s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528411865s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.15( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573786736s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 59'98 active pruub 182.528396606s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.14( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573767662s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 59'98 active pruub 182.528396606s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573729515s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528411865s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.3( v 59'99 (0'0,59'99] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573698044s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 59'98 active pruub 182.528411865s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.14( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573683739s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 0'0 unknown NOTIFY pruub 182.528396606s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.3( v 59'99 (0'0,59'99] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573577881s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 0'0 unknown NOTIFY pruub 182.528411865s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.15( v 59'99 (0'0,59'99] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.573624611s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'97 lcod 59'98 mlcod 0'0 unknown NOTIFY pruub 182.528396606s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576788902s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.532211304s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572895050s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528457642s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576555252s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.532211304s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.2( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572846413s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528533936s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572806358s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528457642s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.18( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576239586s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.532165527s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572647095s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528610229s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.2( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572795868s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528533936s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572585106s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528610229s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.18( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.576163292s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.532165527s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.5( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572252274s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.528533936s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.13( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575842857s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active pruub 182.532211304s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.13( v 54'96 (0'0,54'96] local-lis/les=58/59 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.575673103s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.532211304s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[10.5( v 54'96 (0'0,54'96] local-lis/les=58/59 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=14.572030067s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.528533936s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.14( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1e( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.7( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1d( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.4( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.f( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.5( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.8( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1c( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1a( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.12( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[11.1b( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 62 pg[8.18( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.003000079s ======
Nov 29 01:21:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:25.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:25 np0005539509 systemd-logind[785]: New session 33 of user zuul.
Nov 29 01:21:25 np0005539509 systemd[1]: Started Session 33 of User zuul.
Nov 29 01:21:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Nov 29 01:21:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.14( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.12( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.7( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.f( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1c( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.5( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.4( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1( v 54'2 (0'0,54'2] local-lis/les=62/63 n=1 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.14( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.1b( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1b( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.18( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.8( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.12( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1e( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.10( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.4( v 46'4 (0'0,46'4] local-lis/les=62/63 n=1 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.19( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1d( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[11.1a( v 54'2 (0'0,54'2] local-lis/les=62/63 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 63 pg[8.17( v 46'4 (0'0,46'4] local-lis/les=62/63 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [0] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:26 np0005539509 python3.9[84557]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:27.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:28 np0005539509 python3.9[84771]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:29 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Nov 29 01:21:29 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Nov 29 01:21:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:29.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: Deploying daemon haproxy.rgw.default.compute-2.lpqgfx on compute-2
Nov 29 01:21:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:32 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Nov 29 01:21:32 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Nov 29 01:21:33 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Nov 29 01:21:33 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Nov 29 01:21:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:21:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:33.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:21:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Nov 29 01:21:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Nov 29 01:21:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Nov 29 01:21:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:35.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:36 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.e scrub starts
Nov 29 01:21:36 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.e scrub ok
Nov 29 01:21:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:36.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.3 deep-scrub starts
Nov 29 01:21:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.3 deep-scrub ok
Nov 29 01:21:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:37.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:38.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:39.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Nov 29 01:21:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Nov 29 01:21:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Nov 29 01:21:40 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Nov 29 01:21:40 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Nov 29 01:21:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:40.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:41.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:41 np0005539509 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 01:21:41 np0005539509 systemd[1]: session-33.scope: Consumed 9.823s CPU time.
Nov 29 01:21:41 np0005539509 systemd-logind[785]: Session 33 logged out. Waiting for processes to exit.
Nov 29 01:21:41 np0005539509 systemd-logind[785]: Removed session 33.
Nov 29 01:21:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:42.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:43.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:44 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.d scrub starts
Nov 29 01:21:44 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 4.d scrub ok
Nov 29 01:21:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:44.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 01:21:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:45 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Nov 29 01:21:45 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Nov 29 01:21:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:45.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:46.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:47 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Nov 29 01:21:47 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Nov 29 01:21:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:47.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:48.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:48 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Nov 29 01:21:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Nov 29 01:21:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 01:21:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 01:21:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:49.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:50.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Nov 29 01:21:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Nov 29 01:21:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:51.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Nov 29 01:21:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Nov 29 01:21:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:52.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:53 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Nov 29 01:21:53 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Nov 29 01:21:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:53.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:21:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:54.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:21:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:54 np0005539509 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 01:21:54 np0005539509 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 01:21:54 np0005539509 ceph-mon[80754]: Deploying daemon keepalived.rgw.default.compute-2.klqjoa on compute-2
Nov 29 01:21:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Nov 29 01:21:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:55.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:56 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Nov 29 01:21:56 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Nov 29 01:21:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Nov 29 01:21:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:21:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:21:57 np0005539509 systemd-logind[785]: New session 34 of user zuul.
Nov 29 01:21:57 np0005539509 systemd[1]: Started Session 34 of User zuul.
Nov 29 01:21:58 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Nov 29 01:21:58 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Nov 29 01:21:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:58.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:58 np0005539509 python3.9[84981]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 01:21:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Nov 29 01:21:59 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Nov 29 01:21:59 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Nov 29 01:21:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:21:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:59.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:59 np0005539509 python3.9[85155]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:59 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Nov 29 01:21:59 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Nov 29 01:22:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:00.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:00 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Nov 29 01:22:00 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Nov 29 01:22:01 np0005539509 python3.9[85311]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:22:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:01.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:02.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:02 np0005539509 python3.9[85464]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:22:03 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.f scrub starts
Nov 29 01:22:03 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.f scrub ok
Nov 29 01:22:03 np0005539509 python3.9[85618]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:22:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:03.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:04.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:04 np0005539509 python3.9[85770]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:22:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:04 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Nov 29 01:22:05 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Nov 29 01:22:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:05.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:06 np0005539509 python3.9[85920]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:22:06 np0005539509 network[85937]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:22:06 np0005539509 network[85938]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:22:06 np0005539509 network[85939]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:22:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:06.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Nov 29 01:22:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 01:22:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 01:22:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 01:22:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:07.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Nov 29 01:22:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 01:22:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:08.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Nov 29 01:22:09 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.1e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:09 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.6( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:09 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:09 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 75 pg[9.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=75) [0] r=0 lpr=75 pi=[58,75)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: Deploying daemon keepalived.rgw.default.compute-0.uyqrbs on compute-0
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Nov 29 01:22:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:09.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:10.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:10 np0005539509 python3.9[86199]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:22:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.6( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.6( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.1e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:10 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 76 pg[9.1e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=76) [0]/[1] r=-1 lpr=76 pi=[58,76)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:11.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:11 np0005539509 python3.9[86349]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:22:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Nov 29 01:22:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:12.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Nov 29 01:22:13 np0005539509 python3.9[86503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:22:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Nov 29 01:22:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:13.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:14 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Nov 29 01:22:14 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Nov 29 01:22:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:14.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:14 np0005539509 python3.9[86663]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:22:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:15 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Nov 29 01:22:15 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Nov 29 01:22:15 np0005539509 python3.9[86747]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:22:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:15.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Nov 29 01:22:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.6( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:16 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 78 pg[9.6( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:17 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Nov 29 01:22:17 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Nov 29 01:22:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:17.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Nov 29 01:22:18 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:18.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.6( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 79 pg[9.e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=6 ec=58/47 lis/c=76/58 les/c/f=77/59/0 sis=78) [0] r=0 lpr=78 pi=[58,78)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:19.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:19 np0005539509 podman[87036]: 2025-11-29 06:22:19.835354953 +0000 UTC m=+0.100195680 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Nov 29 01:22:19 np0005539509 podman[87036]: 2025-11-29 06:22:19.968773819 +0000 UTC m=+0.233614526 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 01:22:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:20.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:21.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:22.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:23.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:24.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Nov 29 01:22:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Nov 29 01:22:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:25.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:26.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:22:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Nov 29 01:22:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:22:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Nov 29 01:22:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Nov 29 01:22:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:27.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:27 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.d scrub starts
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.d scrub ok
Nov 29 01:22:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:28.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:28 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 82 pg[9.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [0] r=0 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 82 pg[9.1a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [0] r=0 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:28 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.1a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:28 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 83 pg[9.1a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=83) [0]/[1] r=-1 lpr=83 pi=[58,83)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Nov 29 01:22:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Nov 29 01:22:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:29.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:30.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Nov 29 01:22:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:31.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:32.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:33.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:34.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:35.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:36.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Nov 29 01:22:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:37.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:38.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:40.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Nov 29 01:22:41 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:41 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:41 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:41 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 86 pg[9.a( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:41.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:42.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:43.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Nov 29 01:22:44 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 87 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:44 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 87 pg[9.a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=6 ec=58/47 lis/c=83/58 les/c/f=84/59/0 sis=86) [0] r=0 lpr=86 pi=[58,86)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:45.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:22:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:46.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:47 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.c scrub starts
Nov 29 01:22:47 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.c scrub ok
Nov 29 01:22:47 np0005539509 ceph-mon[80754]: Reconfiguring mon.compute-0 (monmap changed)...
Nov 29 01:22:47 np0005539509 ceph-mon[80754]: Reconfiguring daemon mon.compute-0 on compute-0
Nov 29 01:22:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.vxabpq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 01:22:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:47.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:48.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:49 np0005539509 ceph-mon[80754]: Reconfiguring mgr.compute-0.vxabpq (monmap changed)...
Nov 29 01:22:49 np0005539509 ceph-mon[80754]: Reconfiguring daemon mgr.compute-0.vxabpq on compute-0
Nov 29 01:22:49 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Nov 29 01:22:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Nov 29 01:22:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:50.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:50 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Nov 29 01:22:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:51.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Nov 29 01:22:52 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Nov 29 01:22:52 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:52 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:52 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 01:22:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:52.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Nov 29 01:22:53 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 90 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90) [0] r=0 lpr=90 pi=[77,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:53 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 90 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90) [0] r=0 lpr=90 pi=[77,90)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:53.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:54.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:55 np0005539509 ceph-mon[80754]: Reconfiguring crash.compute-0 (monmap changed)...
Nov 29 01:22:55 np0005539509 ceph-mon[80754]: Reconfiguring daemon crash.compute-0 on compute-0
Nov 29 01:22:55 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Nov 29 01:22:55 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 29 01:22:55 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 29 01:22:55 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Nov 29 01:22:55 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Nov 29 01:22:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:55.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:22:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:56.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:22:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Nov 29 01:22:57 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:57 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:57 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:57 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 91 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=-1 lpr=91 pi=[77,91)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:57.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 01:22:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:58.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:22:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:59.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:00 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Nov 29 01:23:00 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Nov 29 01:23:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:00.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:01.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:02.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Nov 29 01:23:03 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.f scrub starts
Nov 29 01:23:03 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.f scrub ok
Nov 29 01:23:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:03.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:04.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: Reconfiguring osd.1 (monmap changed)...
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: Reconfiguring daemon osd.1 on compute-0
Nov 29 01:23:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:05 np0005539509 systemd[72642]: Created slice User Background Tasks Slice.
Nov 29 01:23:05 np0005539509 systemd[72642]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 01:23:05 np0005539509 systemd[72642]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 01:23:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:05.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.16 deep-scrub starts
Nov 29 01:23:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 5.16 deep-scrub ok
Nov 29 01:23:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:06.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Nov 29 01:23:06 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:06 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:06 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:06 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:07.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 01:23:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 01:23:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:08.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:09 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Nov 29 01:23:09 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Nov 29 01:23:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:09.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:10 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.6 scrub starts
Nov 29 01:23:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:10.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:10 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.6 scrub ok
Nov 29 01:23:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:11.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:12.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Nov 29 01:23:13 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94) [0] r=0 lpr=94 pi=[71,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:13 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94) [0] r=0 lpr=94 pi=[71,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:13 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:13 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 94 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93) [0] r=0 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:13.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.004000104s ======
Nov 29 01:23:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:14.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000104s
Nov 29 01:23:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:15.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:16.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Nov 29 01:23:17 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:17 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:17 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:17.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:18 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Nov 29 01:23:18 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Nov 29 01:23:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:18.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Nov 29 01:23:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:18 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 96 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=-1 lpr=96 pi=[71,96)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.121370023 +0000 UTC m=+0.066281664 container create 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:23:19 np0005539509 systemd[1]: Started libpod-conmon-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope.
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.093807312 +0000 UTC m=+0.038719003 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:23:19 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.23243987 +0000 UTC m=+0.177351561 container init 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.244770482 +0000 UTC m=+0.189682123 container start 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Nov 29 01:23:19 np0005539509 frosty_hermann[87580]: 167 167
Nov 29 01:23:19 np0005539509 systemd[1]: libpod-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope: Deactivated successfully.
Nov 29 01:23:19 np0005539509 conmon[87580]: conmon 5b512b841815487c5bcd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope/container/memory.events
Nov 29 01:23:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.324977099 +0000 UTC m=+0.269888780 container attach 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3)
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.32576032 +0000 UTC m=+0.270671951 container died 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:23:19 np0005539509 systemd[1]: var-lib-containers-storage-overlay-79bf26297953ee68c2a2437eccb93f5b341d84aebb3b99a9b7a8a5df41d5c522-merged.mount: Deactivated successfully.
Nov 29 01:23:19 np0005539509 podman[87564]: 2025-11-29 06:23:19.400211892 +0000 UTC m=+0.345123543 container remove 5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=frosty_hermann, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 01:23:19 np0005539509 systemd[1]: libpod-conmon-5b512b841815487c5bcd9e69cb17b6d23ae37ea02ba32b5488f5bd84c23efbdc.scope: Deactivated successfully.
Nov 29 01:23:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:19.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:20 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Nov 29 01:23:20 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Nov 29 01:23:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:20.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Nov 29 01:23:21 np0005539509 ceph-mon[80754]: Reconfiguring crash.compute-1 (monmap changed)...
Nov 29 01:23:21 np0005539509 ceph-mon[80754]: Reconfiguring daemon crash.compute-1 on compute-1
Nov 29 01:23:21 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:21 np0005539509 podman[87718]: 2025-11-29 06:23:21.519983572 +0000 UTC m=+0.041705623 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:23:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:21.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:22.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:22 np0005539509 podman[87718]: 2025-11-29 06:23:22.437938499 +0000 UTC m=+0.959660490 container create 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 29 01:23:22 np0005539509 systemd[1]: Started libpod-conmon-111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1.scope.
Nov 29 01:23:22 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:23:22 np0005539509 podman[87718]: 2025-11-29 06:23:22.53762832 +0000 UTC m=+1.059350361 container init 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Nov 29 01:23:22 np0005539509 podman[87718]: 2025-11-29 06:23:22.550378614 +0000 UTC m=+1.072100585 container start 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:23:22 np0005539509 zealous_curie[87853]: 167 167
Nov 29 01:23:22 np0005539509 systemd[1]: libpod-111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1.scope: Deactivated successfully.
Nov 29 01:23:22 np0005539509 podman[87718]: 2025-11-29 06:23:22.55545735 +0000 UTC m=+1.077179361 container attach 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 29 01:23:22 np0005539509 podman[87718]: 2025-11-29 06:23:22.556434366 +0000 UTC m=+1.078156377 container died 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:23:22 np0005539509 systemd[1]: var-lib-containers-storage-overlay-0214373e0d899f4d6cc29ccd92286614bda1de2392412cf8e1699d3f67fc5d23-merged.mount: Deactivated successfully.
Nov 29 01:23:22 np0005539509 podman[87718]: 2025-11-29 06:23:22.602969667 +0000 UTC m=+1.124691638 container remove 111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zealous_curie, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 01:23:22 np0005539509 systemd[1]: libpod-conmon-111cfed0d86f244fd744397c0b39f6feb71350903a201a19dfb93169a4ae4cd1.scope: Deactivated successfully.
Nov 29 01:23:22 np0005539509 python3.9[87865]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:23:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:23.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:23 np0005539509 ceph-mon[80754]: Reconfiguring osd.0 (monmap changed)...
Nov 29 01:23:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Nov 29 01:23:23 np0005539509 ceph-mon[80754]: Reconfiguring daemon osd.0 on compute-1
Nov 29 01:23:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:24.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:24 np0005539509 python3.9[88231]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.041416218 +0000 UTC m=+0.056748828 container create 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Nov 29 01:23:25 np0005539509 systemd[1]: Started libpod-conmon-015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae.scope.
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.016105337 +0000 UTC m=+0.031438017 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:23:25 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.137243855 +0000 UTC m=+0.152576505 container init 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.14896645 +0000 UTC m=+0.164299090 container start 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:23:25 np0005539509 serene_brown[88331]: 167 167
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.153590214 +0000 UTC m=+0.168922854 container attach 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 01:23:25 np0005539509 systemd[1]: libpod-015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae.scope: Deactivated successfully.
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.154724375 +0000 UTC m=+0.170057005 container died 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:23:25 np0005539509 systemd[1]: var-lib-containers-storage-overlay-500356c243a1abf867e004dd832e7f7f95ce9826e67cbb576dadfec2a7e3635d-merged.mount: Deactivated successfully.
Nov 29 01:23:25 np0005539509 podman[88291]: 2025-11-29 06:23:25.209252532 +0000 UTC m=+0.224585152 container remove 015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=serene_brown, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:23:25 np0005539509 systemd[1]: libpod-conmon-015b42c8ba1b045be616711d3379bfe4e7ed8befa2af0ea7fb1e8d0248f560ae.scope: Deactivated successfully.
Nov 29 01:23:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:25 np0005539509 ceph-mon[80754]: Reconfiguring mon.compute-1 (monmap changed)...
Nov 29 01:23:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:23:25 np0005539509 ceph-mon[80754]: Reconfiguring daemon mon.compute-1 on compute-1
Nov 29 01:23:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:25.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:25 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.a scrub starts
Nov 29 01:23:25 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.a scrub ok
Nov 29 01:23:26 np0005539509 python3.9[88475]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 01:23:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:26.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Nov 29 01:23:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:26 np0005539509 python3.9[88627]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:23:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:27.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:28 np0005539509 python3.9[88779]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 01:23:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:28.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:29 np0005539509 python3.9[88931]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:29.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Nov 29 01:23:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 99 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:29 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 99 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98) [0] r=0 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.b scrub starts
Nov 29 01:23:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.b scrub ok
Nov 29 01:23:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:30.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:30 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:30 np0005539509 python3.9[89083]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:23:31 np0005539509 python3.9[89161]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:23:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:31.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:32.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:32 np0005539509 python3.9[89313]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:23:33 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.c scrub starts
Nov 29 01:23:33 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.c scrub ok
Nov 29 01:23:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:33.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:33 np0005539509 python3.9[89467]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 01:23:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:34.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.d scrub starts
Nov 29 01:23:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.d scrub ok
Nov 29 01:23:35 np0005539509 python3.9[89620]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 01:23:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:35.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:36.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:37 np0005539509 python3.9[89775]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:23:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Nov 29 01:23:37 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:37 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 01:23:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:37.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:38 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.e scrub starts
Nov 29 01:23:38 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.e scrub ok
Nov 29 01:23:38 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 100 pg[9.10( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=100) [0] r=0 lpr=100 pi=[58,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: Reconfiguring mon.compute-2 (monmap changed)...
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: Reconfiguring daemon mon.compute-2 on compute-2
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Nov 29 01:23:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:38.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:38 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Nov 29 01:23:38 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=101) [0]/[1] r=-1 lpr=101 pi=[58,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:38 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 101 pg[9.10( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=101) [0]/[1] r=-1 lpr=101 pi=[58,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:38 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 101 pg[9.11( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=101) [0] r=0 lpr=101 pi=[58,101)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:38 np0005539509 python3.9[90002]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 01:23:39 np0005539509 podman[90127]: 2025-11-29 06:23:39.016616291 +0000 UTC m=+0.066699276 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 29 01:23:39 np0005539509 podman[90127]: 2025-11-29 06:23:39.142830215 +0000 UTC m=+0.192913230 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:23:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:39.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:40 np0005539509 python3.9[90375]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:23:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:40.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:41.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 01:23:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 01:23:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Nov 29 01:23:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 102 pg[9.11( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=102) [0]/[1] r=-1 lpr=102 pi=[58,102)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 102 pg[9.11( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=102) [0]/[1] r=-1 lpr=102 pi=[58,102)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.16 deep-scrub starts
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.16 deep-scrub ok
Nov 29 01:23:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:42.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 103 pg[9.10( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=101/58 les/c/f=102/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 103 pg[9.10( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=101/58 les/c/f=102/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:42 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 103 pg[9.12( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Nov 29 01:23:43 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.17 deep-scrub starts
Nov 29 01:23:43 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.17 deep-scrub ok
Nov 29 01:23:43 np0005539509 python3.9[90658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:43.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:44 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Nov 29 01:23:44 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Nov 29 01:23:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:44.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:44 np0005539509 python3.9[90810]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:23:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:45 np0005539509 python3.9[90889]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:23:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:45.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:23:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:46.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:46 np0005539509 python3.9[91041]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:23:47 np0005539509 python3.9[91119]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Nov 29 01:23:47 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.11( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=102/58 les/c/f=103/59/0 sis=104) [0] r=0 lpr=104 pi=[58,104)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:47 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.11( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=102/58 les/c/f=103/59/0 sis=104) [0] r=0 lpr=104 pi=[58,104)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:47 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.12( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=104) [0]/[1] r=-1 lpr=104 pi=[58,104)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:47 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.12( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=104) [0]/[1] r=-1 lpr=104 pi=[58,104)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:47 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 104 pg[9.10( v 56'1130 (0'0,56'1130] local-lis/les=103/104 n=6 ec=58/47 lis/c=101/58 les/c/f=102/59/0 sis=103) [0] r=0 lpr=103 pi=[58,103)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:23:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:47.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:23:48 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1c scrub starts
Nov 29 01:23:48 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1c scrub ok
Nov 29 01:23:48 np0005539509 python3.9[91271]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:23:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:23:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:23:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:48.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:49 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1d scrub starts
Nov 29 01:23:49 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1d scrub ok
Nov 29 01:23:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:23:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:49.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:23:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Nov 29 01:23:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:50.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 105 pg[9.11( v 56'1130 (0'0,56'1130] local-lis/les=104/105 n=6 ec=58/47 lis/c=102/58 les/c/f=103/59/0 sis=104) [0] r=0 lpr=104 pi=[58,104)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Nov 29 01:23:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Nov 29 01:23:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:23:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:51.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:23:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:23:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:52.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:23:52 np0005539509 python3.9[91422]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:23:53 np0005539509 python3.9[91574]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 01:23:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:53.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:54.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:55 np0005539509 python3.9[91724]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:23:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:23:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:55.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:23:56 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Nov 29 01:23:56 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Nov 29 01:23:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:23:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:56.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:23:57 np0005539509 python3.9[91876]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:23:57 np0005539509 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 01:23:57 np0005539509 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 01:23:57 np0005539509 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 01:23:57 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.14 scrub starts
Nov 29 01:23:57 np0005539509 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:23:57 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.14 scrub ok
Nov 29 01:23:57 np0005539509 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:23:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:23:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:57.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:23:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:23:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:58.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:23:59 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Nov 29 01:23:59 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Nov 29 01:23:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Nov 29 01:23:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 106 pg[9.12( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=104/58 les/c/f=105/59/0 sis=106) [0] r=0 lpr=106 pi=[58,106)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:59 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 106 pg[9.12( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=104/58 les/c/f=105/59/0 sis=106) [0] r=0 lpr=106 pi=[58,106)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:23:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:59.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:00 np0005539509 python3.9[92039]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 01:24:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:00.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:01.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:02.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:03.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:04.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:04 np0005539509 python3.9[92191]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:24:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Nov 29 01:24:05 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Nov 29 01:24:05 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Nov 29 01:24:05 np0005539509 python3.9[92345]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:24:05 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 107 pg[9.12( v 56'1130 (0'0,56'1130] local-lis/les=106/107 n=4 ec=58/47 lis/c=104/58 les/c/f=105/59/0 sis=106) [0] r=0 lpr=106 pi=[58,106)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:05.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:06.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:07 np0005539509 systemd[1]: session-34.scope: Deactivated successfully.
Nov 29 01:24:07 np0005539509 systemd[1]: session-34.scope: Consumed 1min 12.068s CPU time.
Nov 29 01:24:07 np0005539509 systemd-logind[785]: Session 34 logged out. Waiting for processes to exit.
Nov 29 01:24:07 np0005539509 systemd-logind[785]: Removed session 34.
Nov 29 01:24:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:07.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:08.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:09.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:10.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Nov 29 01:24:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:24:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 01:24:11 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.7 deep-scrub starts
Nov 29 01:24:11 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.7 deep-scrub ok
Nov 29 01:24:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:11.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:12.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:13.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 01:24:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:14.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:15 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Nov 29 01:24:15 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Nov 29 01:24:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:15.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:16 np0005539509 systemd-logind[785]: New session 35 of user zuul.
Nov 29 01:24:16 np0005539509 systemd[1]: Started Session 35 of User zuul.
Nov 29 01:24:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:16.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:17 np0005539509 python3.9[92575]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:17.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:18.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:19 np0005539509 python3.9[92731]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 01:24:19 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.4 deep-scrub starts
Nov 29 01:24:19 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.4 deep-scrub ok
Nov 29 01:24:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Nov 29 01:24:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 01:24:19 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 110 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=110) [0] r=0 lpr=110 pi=[77,110)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:19.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:20 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.12 scrub starts
Nov 29 01:24:20 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.12 scrub ok
Nov 29 01:24:20 np0005539509 python3.9[92884]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:24:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:20.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Nov 29 01:24:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=-1 lpr=111 pi=[77,111)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=-1 lpr=111 pi=[77,111)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=111 pruub=13.241782188s) [2] r=-1 lpr=111 pi=[78,111)/1 crt=56'1130 mlcod 0'0 active pruub 358.085052490s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 111 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=111 pruub=13.241716385s) [2] r=-1 lpr=111 pi=[78,111)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 358.085052490s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:21 np0005539509 python3.9[92968]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:24:21 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1d scrub starts
Nov 29 01:24:21 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1d scrub ok
Nov 29 01:24:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:21.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:22 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Nov 29 01:24:22 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Nov 29 01:24:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:22.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:23.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Nov 29 01:24:24 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.f scrub starts
Nov 29 01:24:24 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.f scrub ok
Nov 29 01:24:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:24.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Nov 29 01:24:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 112 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=0 lpr=112 pi=[78,112)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 112 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=0 lpr=112 pi=[78,112)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:25.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:26 np0005539509 python3.9[93121]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:26.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:27 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Nov 29 01:24:27 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Nov 29 01:24:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:27.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:28 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Nov 29 01:24:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000020s ======
Nov 29 01:24:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:28.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Nov 29 01:24:28 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 01:24:29 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 01:24:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:29.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Nov 29 01:24:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Nov 29 01:24:30 np0005539509 python3.9[93274]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:24:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000019s ======
Nov 29 01:24:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:30.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Nov 29 01:24:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Nov 29 01:24:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:30 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:31 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 113 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=112/113 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] async=[2] r=0 lpr=112 pi=[78,112)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:31 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Nov 29 01:24:31 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Nov 29 01:24:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:31.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:32 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Nov 29 01:24:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:32.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:33 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Nov 29 01:24:33 np0005539509 python3.9[93427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:33.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:34 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Nov 29 01:24:34 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Nov 29 01:24:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:34.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Nov 29 01:24:34 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=112/113 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114 pruub=12.418749809s) [2] async=[2] r=-1 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 56'1130 active pruub 370.969604492s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:34 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=112/113 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114 pruub=12.417335510s) [2] r=-1 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 370.969604492s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:35 np0005539509 python3.9[93580]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 01:24:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 01:24:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1 deep-scrub starts
Nov 29 01:24:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:35 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 114 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=113/114 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113) [0] r=0 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:35 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1 deep-scrub ok
Nov 29 01:24:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:35.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:36 np0005539509 python3.9[93730]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:24:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:36.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:24:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Nov 29 01:24:37 np0005539509 python3.9[93888]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:37.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:38.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Nov 29 01:24:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Nov 29 01:24:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:39.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:40 np0005539509 python3.9[94041]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:24:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:40.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Nov 29 01:24:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Nov 29 01:24:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Nov 29 01:24:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Nov 29 01:24:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Nov 29 01:24:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:41.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:42 np0005539509 python3.9[94330]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:24:42 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.12 scrub starts
Nov 29 01:24:42 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.12 scrub ok
Nov 29 01:24:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:42.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:42 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Nov 29 01:24:43 np0005539509 python3.9[94480]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:24:43 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Nov 29 01:24:43 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 119 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=119 pruub=8.766153336s) [1] r=-1 lpr=119 pi=[86,119)/1 crt=56'1130 mlcod 0'0 active pruub 376.257568359s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:43 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 119 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=119 pruub=8.766060829s) [1] r=-1 lpr=119 pi=[86,119)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 376.257568359s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:43.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Nov 29 01:24:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Nov 29 01:24:44 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 120 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=120) [1]/[0] r=0 lpr=120 pi=[86,120)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:44 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 120 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=86/87 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=120) [1]/[0] r=0 lpr=120 pi=[86,120)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:44 np0005539509 python3.9[94635]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:44.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:45.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 01:24:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Nov 29 01:24:46 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 121 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=120/121 n=5 ec=58/47 lis/c=86/86 les/c/f=87/87/0 sis=120) [1]/[0] async=[1] r=0 lpr=120 pi=[86,120)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:46 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Nov 29 01:24:46 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Nov 29 01:24:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:46.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:47 np0005539509 python3.9[94788]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:47.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 01:24:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 01:24:48 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Nov 29 01:24:48 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Nov 29 01:24:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:48.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:49.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:50 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.e scrub starts
Nov 29 01:24:50 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.e scrub ok
Nov 29 01:24:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:50.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Nov 29 01:24:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Nov 29 01:24:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Nov 29 01:24:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 122 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=120/121 n=5 ec=58/47 lis/c=120/86 les/c/f=121/87/0 sis=122 pruub=10.483498573s) [1] async=[1] r=-1 lpr=122 pi=[86,122)/1 crt=56'1130 mlcod 56'1130 active pruub 386.142822266s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 122 pg[9.1a( v 56'1130 (0'0,56'1130] local-lis/les=120/121 n=5 ec=58/47 lis/c=120/86 les/c/f=121/87/0 sis=122 pruub=10.483215332s) [1] r=-1 lpr=122 pi=[86,122)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 386.142822266s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:24:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:24:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:52.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:52 np0005539509 python3.9[94943]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:24:53 np0005539509 python3.9[95097]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Nov 29 01:24:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:53.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Nov 29 01:24:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 01:24:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:54.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:55 np0005539509 systemd[1]: session-35.scope: Deactivated successfully.
Nov 29 01:24:55 np0005539509 systemd[1]: session-35.scope: Consumed 19.310s CPU time.
Nov 29 01:24:55 np0005539509 systemd-logind[785]: Session 35 logged out. Waiting for processes to exit.
Nov 29 01:24:55 np0005539509 systemd-logind[785]: Removed session 35.
Nov 29 01:24:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:55.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:24:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:56.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:24:57 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.6 deep-scrub starts
Nov 29 01:24:57 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.6 deep-scrub ok
Nov 29 01:24:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:24:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:24:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Nov 29 01:24:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:24:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:59.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Nov 29 01:25:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:02 np0005539509 systemd-logind[785]: New session 36 of user zuul.
Nov 29 01:25:02 np0005539509 systemd[1]: Started Session 36 of User zuul.
Nov 29 01:25:03 np0005539509 python3.9[95280]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:05 np0005539509 python3.9[95434]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:25:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.a scrub starts
Nov 29 01:25:06 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.a scrub ok
Nov 29 01:25:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Nov 29 01:25:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 01:25:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 01:25:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 01:25:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:06.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:06 np0005539509 python3.9[95629]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:25:07 np0005539509 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 01:25:07 np0005539509 systemd[1]: session-36.scope: Consumed 2.635s CPU time.
Nov 29 01:25:07 np0005539509 systemd-logind[785]: Session 36 logged out. Waiting for processes to exit.
Nov 29 01:25:07 np0005539509 systemd-logind[785]: Removed session 36.
Nov 29 01:25:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:08.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:08 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 01:25:09 np0005539509 podman[95826]: 2025-11-29 06:25:09.003610094 +0000 UTC m=+0.082734655 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Nov 29 01:25:09 np0005539509 podman[95826]: 2025-11-29 06:25:09.112967198 +0000 UTC m=+0.192091809 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 01:25:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:09.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:10.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:11.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:12.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:13.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Nov 29 01:25:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:14.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:15 np0005539509 systemd-logind[785]: New session 37 of user zuul.
Nov 29 01:25:15 np0005539509 systemd[1]: Started Session 37 of User zuul.
Nov 29 01:25:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:15 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 127 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=127 pruub=13.503620148s) [2] r=-1 lpr=127 pi=[93,127)/1 crt=56'1130 mlcod 0'0 active pruub 413.242828369s@ mbc={}] start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:15 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 127 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=127 pruub=13.503321648s) [2] r=-1 lpr=127 pi=[93,127)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 413.242828369s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:25:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:16.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:16 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.d scrub starts
Nov 29 01:25:16 np0005539509 python3.9[96099]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Nov 29 01:25:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:16 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:16.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:16 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.d scrub ok
Nov 29 01:25:17 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.f scrub starts
Nov 29 01:25:17 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.f scrub ok
Nov 29 01:25:17 np0005539509 python3.9[96253]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:18.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:18.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:18 np0005539509 python3.9[96537]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 01:25:19 np0005539509 python3.9[96621]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:20.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:20.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=0 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=93/94 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=0 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:25:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=129 pruub=9.221545219s) [1] r=-1 lpr=129 pi=[78,129)/1 crt=56'1130 mlcod 0'0 active pruub 414.086791992s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:21 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 129 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=129 pruub=9.221508980s) [1] r=-1 lpr=129 pi=[78,129)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 414.086791992s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:25:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:22.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 01:25:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 01:25:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Nov 29 01:25:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:22.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:23 np0005539509 python3.9[96774]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:25:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:25:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 01:25:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:25:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:24.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Nov 29 01:25:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 131 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=131) [1]/[0] r=0 lpr=131 pi=[78,131)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 131 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=78/79 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=131) [1]/[0] r=0 lpr=131 pi=[78,131)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:25:24 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 131 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] async=[2] r=0 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:25:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:24.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Nov 29 01:25:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132 pruub=15.171678543s) [2] async=[2] r=-1 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 56'1130 active pruub 424.108947754s@ mbc={255={}}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132 pruub=15.171549797s) [2] r=-1 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 424.108947754s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:25:25 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 132 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=131) [1]/[0] async=[1] r=0 lpr=131 pi=[78,131)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:25:25 np0005539509 python3.9[96969]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:26.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:26 np0005539509 python3.9[97121]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:25:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Nov 29 01:25:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 133 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133 pruub=14.523239136s) [1] async=[1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 56'1130 active pruub 424.948211670s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:26 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 133 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133 pruub=14.522854805s) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 424.948211670s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:25:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:26.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:27 np0005539509 python3.9[97284]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:27 np0005539509 python3.9[97362]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Nov 29 01:25:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:28.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:28 np0005539509 python3.9[97514]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:28.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:28 np0005539509 python3.9[97592]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:30.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:30 np0005539509 python3.9[97744]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1f deep-scrub starts
Nov 29 01:25:30 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.1f deep-scrub ok
Nov 29 01:25:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:30.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:30 np0005539509 python3.9[97896]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:31 np0005539509 python3.9[98048]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:32.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:32 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Nov 29 01:25:32 np0005539509 python3.9[98200]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:32.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:33 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Nov 29 01:25:33 np0005539509 python3.9[98352]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:34.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:34.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:36.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:36.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:37 np0005539509 python3.9[98505]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Nov 29 01:25:37 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Nov 29 01:25:37 np0005539509 python3.9[98659]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:25:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:38.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:38.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:38 np0005539509 python3.9[98811]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:25:39 np0005539509 python3.9[98963]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:25:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:40.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:40.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:40 np0005539509 python3.9[99116]: ansible-service_facts Invoked
Nov 29 01:25:40 np0005539509 network[99133]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:25:40 np0005539509 network[99134]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:25:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Nov 29 01:25:40 np0005539509 network[99135]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:25:41 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.898561478s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 active pruub 437.762023926s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:41 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:25:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:42.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:42.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:44.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:44.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:46.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Nov 29 01:25:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:46.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:46 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:46 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:25:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:48.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:48 np0005539509 python3.9[99587]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:48.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:50 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:25:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:50.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:50.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Nov 29 01:25:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:52.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:52 np0005539509 python3.9[99740]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 01:25:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:52.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:53 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:25:53 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:54.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:54 np0005539509 python3.9[99892]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:25:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:54.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:25:54 np0005539509 python3.9[100020]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:55 np0005539509 python3.9[100172]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:56.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:56 np0005539509 python3.9[100250]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:56.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:57 np0005539509 python3.9[100402]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:58.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:25:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:58.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Nov 29 01:25:59 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:26:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:00.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:00 np0005539509 python3.9[100554]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:00.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:01 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.835290909s) [1] async=[1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 56'1130 active pruub 461.014038086s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:26:01 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:26:01 np0005539509 python3.9[100638]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:26:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:02.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:02.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:02 np0005539509 systemd[1]: session-37.scope: Deactivated successfully.
Nov 29 01:26:02 np0005539509 systemd[1]: session-37.scope: Consumed 26.501s CPU time.
Nov 29 01:26:02 np0005539509 systemd-logind[785]: Session 37 logged out. Waiting for processes to exit.
Nov 29 01:26:02 np0005539509 systemd-logind[785]: Removed session 37.
Nov 29 01:26:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:04.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Nov 29 01:26:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:04.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:06.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:06.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:08.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:08 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:08 np0005539509 systemd-logind[785]: New session 38 of user zuul.
Nov 29 01:26:08 np0005539509 systemd[1]: Started Session 38 of User zuul.
Nov 29 01:26:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:08.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:09 np0005539509 python3.9[100824]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:10.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:10 np0005539509 python3.9[100976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:10.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:11 np0005539509 python3.9[101054]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:11 np0005539509 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 01:26:11 np0005539509 systemd[1]: session-38.scope: Consumed 1.842s CPU time.
Nov 29 01:26:11 np0005539509 systemd-logind[785]: Session 38 logged out. Waiting for processes to exit.
Nov 29 01:26:11 np0005539509 systemd-logind[785]: Removed session 38.
Nov 29 01:26:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:12.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:12.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:13 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:14.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.669173) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574669319, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7845, "num_deletes": 255, "total_data_size": 16680430, "memory_usage": 16933152, "flush_reason": "Manual Compaction"}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 29 01:26:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:14.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574761591, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10246072, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 257, "largest_seqno": 7850, "table_properties": {"data_size": 10212483, "index_size": 22402, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10117, "raw_key_size": 96609, "raw_average_key_size": 24, "raw_value_size": 10133446, "raw_average_value_size": 2520, "num_data_blocks": 982, "num_entries": 4021, "num_filter_entries": 4021, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 1764397161, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 92525 microseconds, and 26134 cpu microseconds.
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.761699) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10246072 bytes OK
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.761725) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.767871) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.767886) EVENT_LOG_v1 {"time_micros": 1764397574767882, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.767903) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16637428, prev total WAL file size 16638063, number of live WAL files 2.
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.771616) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10005KB) 8(1648B)]
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574771839, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10247720, "oldest_snapshot_seqno": -1}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3770 keys, 10242579 bytes, temperature: kUnknown
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574864128, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10242579, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10209657, "index_size": 22380, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 92431, "raw_average_key_size": 24, "raw_value_size": 10133726, "raw_average_value_size": 2687, "num_data_blocks": 982, "num_entries": 3770, "num_filter_entries": 3770, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.864451) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10242579 bytes
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.867268) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.9 rd, 110.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(9.8, 0.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 4026, records dropped: 256 output_compression: NoCompression
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.867307) EVENT_LOG_v1 {"time_micros": 1764397574867277, "job": 4, "event": "compaction_finished", "compaction_time_micros": 92377, "compaction_time_cpu_micros": 33826, "output_level": 6, "num_output_files": 1, "total_output_size": 10242579, "num_input_records": 4026, "num_output_records": 3770, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574869538, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574869587, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 29 01:26:14 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:26:14.771348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:26:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:16.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:16.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:17 np0005539509 systemd-logind[785]: New session 39 of user zuul.
Nov 29 01:26:17 np0005539509 systemd[1]: Started Session 39 of User zuul.
Nov 29 01:26:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:18.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:18 np0005539509 python3.9[101235]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:26:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:26:20 np0005539509 python3.9[101391]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:20.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:22 np0005539509 python3.9[101567]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:22.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:22 np0005539509 python3.9[101645]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.o1yfqgms recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:22.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:23 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:23 np0005539509 python3.9[101797]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:24.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:24 np0005539509 python3.9[101875]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.ybw6pfgn recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:24.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:24 np0005539509 python3.9[102027]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:26 np0005539509 python3.9[102179]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:26 np0005539509 python3.9[102257]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:26.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:27 np0005539509 python3.9[102409]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:27 np0005539509 python3.9[102487]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:28.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:28 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:28 np0005539509 python3.9[102639]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:28.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:29 np0005539509 python3.9[102791]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:30 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:26:30 np0005539509 python3.9[102869]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:30.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:30.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:30 np0005539509 python3.9[103021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:31 np0005539509 python3.9[103099]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:32.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:32 np0005539509 python3.9[103251]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:26:32 np0005539509 systemd[1]: Reloading.
Nov 29 01:26:32 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:26:32 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:26:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:32.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:33 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:33 np0005539509 python3.9[103440]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:34.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:34 np0005539509 python3.9[103518]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:34.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:35 np0005539509 python3.9[103670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:35 np0005539509 python3.9[103748]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:36.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:36 np0005539509 python3.9[103900]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:26:36 np0005539509 systemd[1]: Reloading.
Nov 29 01:26:36 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:26:36 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:26:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:36.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:36 np0005539509 systemd[1]: Starting Create netns directory...
Nov 29 01:26:36 np0005539509 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:26:36 np0005539509 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:26:36 np0005539509 systemd[1]: Finished Create netns directory.
Nov 29 01:26:37 np0005539509 python3.9[104093]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:26:37 np0005539509 network[104110]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:26:37 np0005539509 network[104111]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:26:37 np0005539509 network[104112]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:26:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:38.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:38 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:40.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:40.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:26:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:42.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:26:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:42.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:43 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:44.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:44 np0005539509 python3.9[104374]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:44.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:45 np0005539509 python3.9[104452]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:46 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:26:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:26:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:46.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:26:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:46.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:48.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:48 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:50 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:26:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:50.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:50 np0005539509 python3.9[104608]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:26:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:50.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:26:51 np0005539509 python3.9[104760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 252..773) lease_timeout -- calling new election
Nov 29 01:26:51 np0005539509 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 01:26:51 np0005539509 ceph-mon[80754]: paxos.2).electionLogic(14) init, last seen epoch 14
Nov 29 01:26:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:26:52 np0005539509 python3.9[104838]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:52.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:26:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:52.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:26:53 np0005539509 python3.9[104990]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 01:26:53 np0005539509 systemd[1]: Starting Time & Date Service...
Nov 29 01:26:53 np0005539509 systemd[1]: Started Time & Date Service.
Nov 29 01:26:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:54.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:54 np0005539509 python3.9[105146]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:54.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:54 np0005539509 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 01:26:54 np0005539509 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 01:26:54 np0005539509 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 01:26:54 np0005539509 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:26:54 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:26:55 np0005539509 python3.9[105423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:55 np0005539509 podman[105472]: 2025-11-29 06:26:55.620188535 +0000 UTC m=+0.309277261 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:26:55 np0005539509 podman[105472]: 2025-11-29 06:26:55.734051917 +0000 UTC m=+0.423140603 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3)
Nov 29 01:26:55 np0005539509 python3.9[105561]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:56.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:56 np0005539509 python3.9[105752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:56.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:56 np0005539509 python3.9[105900]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.o5ocs_xp recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:57 np0005539509 python3.9[106052]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:58.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:58 np0005539509 python3.9[106130]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:58 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:26:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:59 np0005539509 python3.9[106282]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:00.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:00 np0005539509 python3[106435]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:27:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:00.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:01 np0005539509 python3.9[106683]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:01 np0005539509 python3.9[106796]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:02.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:02 np0005539509 python3.9[106948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:02.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:02 np0005539509 python3.9[107027]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:03 np0005539509 python3.9[107179]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:04.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:04 np0005539509 python3.9[107257]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:04.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:05 np0005539509 python3.9[107409]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:05 np0005539509 python3.9[107487]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:06.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:06 np0005539509 python3.9[107639]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:07 np0005539509 python3.9[107717]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:08.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:08 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:08.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:09 np0005539509 python3.9[107869]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:10.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:10 np0005539509 python3.9[108024]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:11 np0005539509 python3.9[108176]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:12.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:12 np0005539509 python3.9[108328]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:12.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:13 np0005539509 python3.9[108480]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:27:13 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:14 np0005539509 python3.9[108632]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:27:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:14.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:27:14 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:14.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:14 np0005539509 systemd[1]: session-39.scope: Deactivated successfully.
Nov 29 01:27:14 np0005539509 systemd[1]: session-39.scope: Consumed 32.826s CPU time.
Nov 29 01:27:14 np0005539509 systemd-logind[785]: Session 39 logged out. Waiting for processes to exit.
Nov 29 01:27:14 np0005539509 systemd-logind[785]: Removed session 39.
Nov 29 01:27:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:16.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:16.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:18.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:18.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:27:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:20.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:20.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:21 np0005539509 systemd-logind[785]: New session 40 of user zuul.
Nov 29 01:27:21 np0005539509 systemd[1]: Started Session 40 of User zuul.
Nov 29 01:27:21 np0005539509 python3.9[108817]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 01:27:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:22.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:22.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:22 np0005539509 python3.9[109019]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:27:23 np0005539509 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 01:27:23 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:23 np0005539509 python3.9[109173]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 29 01:27:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:24.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:24 np0005539509 python3.9[109327]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.t0fyas5g follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:24.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:25 np0005539509 python3.9[109452]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.t0fyas5g mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397643.9492702-108-120182948309350/.source.t0fyas5g _original_basename=.q5o1tos6 follow=False checksum=b291f010aefff8b88f41011b780271a83fd1182f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:26 np0005539509 python3.9[109604]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:27 np0005539509 python3.9[109756]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=#012 create=True mode=0644 path=/tmp/ansible.t0fyas5g state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:28.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:28 np0005539509 python3.9[109908]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.t0fyas5g' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:28 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:28.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:29 np0005539509 python3.9[110062]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.t0fyas5g state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:30 np0005539509 systemd-logind[785]: Session 40 logged out. Waiting for processes to exit.
Nov 29 01:27:30 np0005539509 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 01:27:30 np0005539509 systemd[1]: session-40.scope: Consumed 5.820s CPU time.
Nov 29 01:27:30 np0005539509 systemd-logind[785]: Removed session 40.
Nov 29 01:27:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:30.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:30.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:32.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:32.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:33 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:34.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:34.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:34 np0005539509 systemd-logind[785]: New session 41 of user zuul.
Nov 29 01:27:35 np0005539509 systemd[1]: Started Session 41 of User zuul.
Nov 29 01:27:36 np0005539509 python3.9[110240]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:36.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:36.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:37 np0005539509 python3.9[110396]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:27:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:38.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:38 np0005539509 python3.9[110550]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:27:38 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:38.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:39 np0005539509 python3.9[110703]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:40.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:40.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:41 np0005539509 python3.9[110856]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:27:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:42.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:42 np0005539509 python3.9[111008]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:42.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:42 np0005539509 systemd-logind[785]: Session 41 logged out. Waiting for processes to exit.
Nov 29 01:27:42 np0005539509 systemd[1]: session-41.scope: Deactivated successfully.
Nov 29 01:27:42 np0005539509 systemd[1]: session-41.scope: Consumed 4.317s CPU time.
Nov 29 01:27:42 np0005539509 systemd-logind[785]: Removed session 41.
Nov 29 01:27:43 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:44.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:44.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:46.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:48 np0005539509 systemd-logind[785]: New session 42 of user zuul.
Nov 29 01:27:48 np0005539509 systemd[1]: Started Session 42 of User zuul.
Nov 29 01:27:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:48.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:49 np0005539509 python3.9[111186]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:50.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:50 np0005539509 python3.9[111342]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:27:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:50.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:51 np0005539509 python3.9[111426]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:27:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:52.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:52.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:53 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:27:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:54.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:27:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:56.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:27:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:56.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:57 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:27:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:58.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:27:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:00.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:28:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:28:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:28:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:02.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:28:02 np0005539509 python3.9[111578]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:28:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:02.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:28:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:28:03 np0005539509 python3.9[111729]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:28:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:28:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:04.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:28:04 np0005539509 python3.9[111879]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:04.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:05 np0005539509 python3.9[112029]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:06 np0005539509 systemd-logind[785]: Session 42 logged out. Waiting for processes to exit.
Nov 29 01:28:06 np0005539509 systemd[1]: session-42.scope: Deactivated successfully.
Nov 29 01:28:06 np0005539509 systemd[1]: session-42.scope: Consumed 6.357s CPU time.
Nov 29 01:28:06 np0005539509 systemd-logind[785]: Removed session 42.
Nov 29 01:28:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:06.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:06.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:08.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:08 np0005539509 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 01:28:08 np0005539509 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 01:28:08 np0005539509 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:28:08 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:28:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:08.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:10.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:10.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:11 np0005539509 systemd-logind[785]: New session 43 of user zuul.
Nov 29 01:28:11 np0005539509 systemd[1]: Started Session 43 of User zuul.
Nov 29 01:28:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:12.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:12 np0005539509 python3.9[112207]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:12.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:14.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:14 np0005539509 python3.9[112363]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:14.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:15 np0005539509 python3.9[112515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:16 np0005539509 python3.9[112667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:16.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:16 np0005539509 python3.9[112790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397695.4851682-163-217729628174384/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=c753fd50f03190549921f4ec9ebe197ccf1ffe37 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:16.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:17 np0005539509 python3.9[112942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:18.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:18 np0005539509 python3.9[113065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397697.0490756-163-7455501384218/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=03c2952c2692ca442730881904078ac3e566f340 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:18.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:18 np0005539509 python3.9[113217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:19 np0005539509 python3.9[113340]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397698.4946883-163-117985615079690/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=27db9be1e23c3016377de86e7cf7031ed01bcf2d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:20.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:20 np0005539509 python3.9[113492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:28:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:28:22 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:28:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:22.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:22 np0005539509 python3.9[113644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:22.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:23 np0005539509 python3.9[113917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:23 np0005539509 python3.9[114040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397702.792364-374-62534259531908/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=af9225c5d9213edb8553d0100161ec5ac71c6435 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:24.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:24 np0005539509 python3.9[114205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:24.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:25 np0005539509 python3.9[114447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397704.0555966-374-199806165432541/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:26 np0005539509 python3.9[114599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:28:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:26.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:26 np0005539509 python3.9[114722]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397705.5784872-374-236808187045402/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b27264d68008dc068de4ee4a6430b05babb8b7a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:26.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:27 np0005539509 python3.9[114876]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:28 np0005539509 python3.9[115028]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:28.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:28.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:29 np0005539509 python3.9[115180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:29 np0005539509 python3.9[115303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397708.3308923-554-103063587327110/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=211e9c84831fe02b2c1e90a47350bc311a668a8e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:30.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:30 np0005539509 python3.9[115455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:30.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:31 np0005539509 python3.9[115578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397709.940725-554-184543457979669/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:31 np0005539509 python3.9[115730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:32.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:32 np0005539509 python3.9[115853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397711.2354426-554-83206310104670/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=5d7eb31663823a154f0a44a495e71d206222dec7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:32.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:34 np0005539509 python3.9[116005]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:34.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:34 np0005539509 python3.9[116157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:28:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:34.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:28:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:35 np0005539509 python3.9[116280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397714.381309-771-91806314075671/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:28:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 19.19 MB, 0.03 MB/s#012Interval WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Nov 29 01:28:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:36.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:36 np0005539509 python3.9[116432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:36.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:37 np0005539509 python3.9[116584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:37 np0005539509 python3.9[116709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397716.6355143-849-130975966596119/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:38.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:38 np0005539509 python3.9[116861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:38.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:39 np0005539509 python3.9[117013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:40 np0005539509 python3.9[117136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397718.8209403-922-174509620318474/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:40.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:40 np0005539509 python3.9[117288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:40.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:41 np0005539509 python3.9[117440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:42 np0005539509 python3.9[117613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397721.024441-994-78353063345621/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:42.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:42.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:43 np0005539509 python3.9[117765]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:43 np0005539509 python3.9[117920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:44 np0005539509 python3.9[118043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397723.2541199-1059-236971876014537/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:44.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:44.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:45 np0005539509 python3.9[118195]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:45 np0005539509 python3.9[118347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:46.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:46 np0005539509 python3.9[118470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397725.383341-1107-124750937152171/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:28:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:46.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:28:47 np0005539509 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 01:28:47 np0005539509 systemd[1]: session-43.scope: Consumed 25.559s CPU time.
Nov 29 01:28:47 np0005539509 systemd-logind[785]: Session 43 logged out. Waiting for processes to exit.
Nov 29 01:28:47 np0005539509 systemd-logind[785]: Removed session 43.
Nov 29 01:28:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:48.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:48.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:50.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:50.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:52.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:52 np0005539509 systemd-logind[785]: New session 44 of user zuul.
Nov 29 01:28:52 np0005539509 systemd[1]: Started Session 44 of User zuul.
Nov 29 01:28:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:52.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:53 np0005539509 python3.9[118651]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:54 np0005539509 python3.9[118803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:54.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:28:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:54.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:28:55 np0005539509 python3.9[118926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397733.799546-68-184508673086860/.source.conf _original_basename=ceph.conf follow=False checksum=b678e866ce48244e104f356f74865d3398155ff0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:55 np0005539509 python3.9[119078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:56.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:56 np0005539509 python3.9[119201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397735.4348497-68-77469776433554/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=d5bc1b1c0617b147c8e3e13846b179249a244079 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:56.881358) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736881526, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1353, "num_deletes": 253, "total_data_size": 3254957, "memory_usage": 3303176, "flush_reason": "Manual Compaction"}
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 29 01:28:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736986792, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1349524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7855, "largest_seqno": 9203, "table_properties": {"data_size": 1344764, "index_size": 2156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12194, "raw_average_key_size": 20, "raw_value_size": 1334447, "raw_average_value_size": 2246, "num_data_blocks": 99, "num_entries": 594, "num_filter_entries": 594, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397574, "oldest_key_time": 1764397574, "file_creation_time": 1764397736, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 105507 microseconds, and 5986 cpu microseconds.
Nov 29 01:28:56 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:28:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:56.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:57 np0005539509 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 01:28:57 np0005539509 systemd[1]: session-44.scope: Consumed 3.149s CPU time.
Nov 29 01:28:57 np0005539509 systemd-logind[785]: Session 44 logged out. Waiting for processes to exit.
Nov 29 01:28:57 np0005539509 systemd-logind[785]: Removed session 44.
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:56.986876) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1349524 bytes OK
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:56.986910) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.078766) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.078846) EVENT_LOG_v1 {"time_micros": 1764397737078830, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.078885) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 3248581, prev total WAL file size 3264022, number of live WAL files 2.
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.124337) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323534' seq:0, type:0; will stop at (end)
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1317KB)], [15(10002KB)]
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737124679, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11592103, "oldest_snapshot_seqno": -1}
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3886 keys, 9449294 bytes, temperature: kUnknown
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737453892, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9449294, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9417812, "index_size": 20684, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 95405, "raw_average_key_size": 24, "raw_value_size": 9341875, "raw_average_value_size": 2403, "num_data_blocks": 911, "num_entries": 3886, "num_filter_entries": 3886, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764397737, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.454229) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9449294 bytes
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.497539) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 35.2 rd, 28.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 9.8 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(15.6) write-amplify(7.0) OK, records in: 4364, records dropped: 478 output_compression: NoCompression
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.497594) EVENT_LOG_v1 {"time_micros": 1764397737497573, "job": 6, "event": "compaction_finished", "compaction_time_micros": 329344, "compaction_time_cpu_micros": 31293, "output_level": 6, "num_output_files": 1, "total_output_size": 9449294, "num_input_records": 4364, "num_output_records": 3886, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737498278, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397737500810, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.124196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.500994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501002) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:28:57.501005) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:58.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:28:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:58.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:29:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:00.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:29:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:00.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:02.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:02 np0005539509 systemd-logind[785]: New session 45 of user zuul.
Nov 29 01:29:02 np0005539509 systemd[1]: Started Session 45 of User zuul.
Nov 29 01:29:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:02.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:04 np0005539509 python3.9[119379]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:04.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:04.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:05 np0005539509 python3.9[119535]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539509 python3.9[119687]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:06.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:29:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:07.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:07 np0005539509 python3.9[119837]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:07 np0005539509 python3.9[119989]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:29:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:08.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:09.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:10.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:11.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:29:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:12.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:13.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:13 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 01:29:14 np0005539509 python3.9[120148]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:29:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:14.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:14 np0005539509 python3.9[120232]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:29:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:15.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:16.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:17.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:18 np0005539509 python3.9[120385]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:29:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:18.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:19.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:20.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:20 np0005539509 python3[120540]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 01:29:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:21.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:21 np0005539509 python3.9[120692]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:29:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 1338 writes, 9364 keys, 1338 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 1338 writes, 1338 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1338 writes, 9364 keys, 1338 commit groups, 1.0 writes per commit group, ingest: 19.42 MB, 0.03 MB/s
Interval WAL: 1338 writes, 1338 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     36.2      0.31              0.03         3    0.102       0      0       0.0       0.0
  L6      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.7     49.4     44.5      0.42              0.07         2    0.211    8390    734       0.0       0.0
 Sum      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     28.6     41.0      0.73              0.10         5    0.145    8390    734       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     33.6     48.1      0.62              0.10         4    0.155    8390    734       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     49.4     44.5      0.42              0.07         2    0.211    8390    734       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     55.8      0.20              0.03         2    0.099       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.011, interval 0.011
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.7 seconds
Interval compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.6 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 304.00 MB usage: 806.88 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.9e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(36,700.22 KB,0.224937%) FilterBlock(5,31.98 KB,0.0102746%) IndexBlock(5,74.67 KB,0.0239874%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Nov 29 01:29:22 np0005539509 python3.9[120844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:22.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:22 np0005539509 python3.9[120922]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:23.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:23 np0005539509 python3.9[121074]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:24 np0005539509 python3.9[121152]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.w9f0oyvv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:24.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:25.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:25 np0005539509 python3.9[121304]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:25 np0005539509 python3.9[121382]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:26.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:27 np0005539509 python3.9[121534]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:28 np0005539509 python3[121687]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:29:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:29:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:28.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:28 np0005539509 python3.9[121839]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:29.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:29 np0005539509 python3.9[121964]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397768.4213958-437-17887342573909/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:30.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:30 np0005539509 python3.9[122116]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:31 np0005539509 python3.9[122241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397770.1671193-482-68565251634054/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:32.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:32 np0005539509 python3.9[122394]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:33.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:33 np0005539509 python3.9[122519]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397772.0004644-527-166895828604484/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:34 np0005539509 python3.9[122671]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:34.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:34 np0005539509 python3.9[122796]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397773.744868-572-108463550360372/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:36 np0005539509 python3.9[122948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:36.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:36 np0005539509 python3.9[123073]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397775.5084586-617-211631853474894/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:37 np0005539509 python3.9[123225]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:38.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:38 np0005539509 python3.9[123377]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:39 np0005539509 python3.9[123532]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:40.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:40 np0005539509 python3.9[123684]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:29:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:41 np0005539509 python3.9[123837]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:29:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:42.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:42 np0005539509 python3.9[124108]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:43.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:43 np0005539509 python3.9[124277]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:44.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:45.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 01:29:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 01:29:46 np0005539509 python3.9[124428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:46.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:47.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Nov 29 01:29:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:48.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:29:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:49.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:49 np0005539509 python3.9[124581]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:49 np0005539509 ovs-vsctl[124582]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 01:29:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:50.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:50 np0005539509 python3.9[124734]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:29:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:51.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:52.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:53.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:53 np0005539509 python3.9[124889]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:53 np0005539509 ovs-vsctl[124890]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 01:29:53 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:53 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:54 np0005539509 python3.9[125040]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:29:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:29:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:29:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:54.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:55 np0005539509 python3.9[125194]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:55.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:55 np0005539509 python3.9[125348]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:56 np0005539509 python3.9[125426]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:56.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:57.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:57 np0005539509 python3.9[125578]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:57 np0005539509 python3.9[125656]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:58.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:58 np0005539509 python3.9[125808]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:29:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:59.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:59 np0005539509 python3.9[125960]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:00 np0005539509 python3.9[126038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:00.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:01 np0005539509 python3.9[126190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:01.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:01 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:30:01 np0005539509 python3.9[126268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:02 np0005539509 python3.9[126420]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:02 np0005539509 systemd[1]: Reloading.
Nov 29 01:30:02 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:02 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:02.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:03.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:03 np0005539509 python3.9[126611]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:04 np0005539509 python3.9[126689]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:04.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:04 np0005539509 python3.9[126841]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:05.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:05 np0005539509 python3.9[126921]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:06 np0005539509 python3.9[127073]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:06 np0005539509 systemd[1]: Reloading.
Nov 29 01:30:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:30:06 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:30:06 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:06 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:06.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:06 np0005539509 systemd[1]: Starting Create netns directory...
Nov 29 01:30:06 np0005539509 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:30:06 np0005539509 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:30:06 np0005539509 systemd[1]: Finished Create netns directory.
Nov 29 01:30:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:07.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:07 np0005539509 python3.9[127320]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:08.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:08 np0005539509 python3.9[127472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:09 np0005539509 python3.9[127595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.1312945-1370-242769842265489/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:10 np0005539509 python3.9[127749]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000054s ======
Nov 29 01:30:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:10.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Nov 29 01:30:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:11.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:11 np0005539509 python3.9[127901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:11 np0005539509 python3.9[128024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.6662323-1445-246328890831847/.source.json _original_basename=.gfa_p6x1 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:12.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:12 np0005539509 python3.9[128176]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:13.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:13 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:30:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:14.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:15.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:15 np0005539509 python3.9[128603]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 01:30:16 np0005539509 python3.9[128755]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:30:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:16.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:17.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:17 np0005539509 python3.9[128907]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:30:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:18.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:19.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:19 np0005539509 python3[129086]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:30:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:20.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:21.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:22.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:24.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:25.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:26 np0005539509 podman[129099]: 2025-11-29 06:30:26.043950194 +0000 UTC m=+6.150896444 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:26 np0005539509 podman[129219]: 2025-11-29 06:30:26.200024555 +0000 UTC m=+0.054483990 container create e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 01:30:26 np0005539509 podman[129219]: 2025-11-29 06:30:26.176970204 +0000 UTC m=+0.031429659 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:26 np0005539509 python3[129086]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:26.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:27.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:28.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:29.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:30.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:30 np0005539509 python3.9[129409]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:31.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:31 np0005539509 python3.9[129563]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:32.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:32 np0005539509 python3.9[129639]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:33.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:33 np0005539509 python3.9[129795]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397832.9650795-1709-213072460580755/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:34 np0005539509 python3.9[129871]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:30:34 np0005539509 systemd[1]: Reloading.
Nov 29 01:30:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:34.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:34 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:34 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:35.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:35 np0005539509 python3.9[129983]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:35 np0005539509 systemd[1]: Reloading.
Nov 29 01:30:35 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:35 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:35 np0005539509 systemd[1]: Starting ovn_controller container...
Nov 29 01:30:35 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:30:36 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d06215a7c9e2f4d808980f1813b1b8a04e986648f3584f9f2e3ba032b924b4a8/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:30:36 np0005539509 systemd[1]: Started /usr/bin/podman healthcheck run e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275.
Nov 29 01:30:36 np0005539509 podman[130024]: 2025-11-29 06:30:36.03736806 +0000 UTC m=+0.124107610 container init e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + sudo -E kolla_set_configs
Nov 29 01:30:36 np0005539509 podman[130024]: 2025-11-29 06:30:36.065601973 +0000 UTC m=+0.152341433 container start e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:30:36 np0005539509 edpm-start-podman-container[130024]: ovn_controller
Nov 29 01:30:36 np0005539509 systemd[1]: Created slice User Slice of UID 0.
Nov 29 01:30:36 np0005539509 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 01:30:36 np0005539509 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 01:30:36 np0005539509 systemd[1]: Starting User Manager for UID 0...
Nov 29 01:30:36 np0005539509 edpm-start-podman-container[130023]: Creating additional drop-in dependency for "ovn_controller" (e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275)
Nov 29 01:30:36 np0005539509 podman[130046]: 2025-11-29 06:30:36.150551425 +0000 UTC m=+0.074857051 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:30:36 np0005539509 systemd[1]: e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275-5d0a1a16db78e533.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:30:36 np0005539509 systemd[1]: e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275-5d0a1a16db78e533.service: Failed with result 'exit-code'.
Nov 29 01:30:36 np0005539509 systemd[1]: Reloading.
Nov 29 01:30:36 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:36 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:36 np0005539509 systemd[130077]: Queued start job for default target Main User Target.
Nov 29 01:30:36 np0005539509 systemd[130077]: Created slice User Application Slice.
Nov 29 01:30:36 np0005539509 systemd[130077]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 01:30:36 np0005539509 systemd[130077]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:30:36 np0005539509 systemd[130077]: Reached target Paths.
Nov 29 01:30:36 np0005539509 systemd[130077]: Reached target Timers.
Nov 29 01:30:36 np0005539509 systemd[130077]: Starting D-Bus User Message Bus Socket...
Nov 29 01:30:36 np0005539509 systemd[130077]: Starting Create User's Volatile Files and Directories...
Nov 29 01:30:36 np0005539509 systemd[130077]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:30:36 np0005539509 systemd[130077]: Reached target Sockets.
Nov 29 01:30:36 np0005539509 systemd[130077]: Finished Create User's Volatile Files and Directories.
Nov 29 01:30:36 np0005539509 systemd[130077]: Reached target Basic System.
Nov 29 01:30:36 np0005539509 systemd[130077]: Reached target Main User Target.
Nov 29 01:30:36 np0005539509 systemd[130077]: Startup finished in 168ms.
Nov 29 01:30:36 np0005539509 systemd[1]: Started User Manager for UID 0.
Nov 29 01:30:36 np0005539509 systemd[1]: Started ovn_controller container.
Nov 29 01:30:36 np0005539509 systemd[1]: Started Session c1 of User root.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: INFO:__main__:Validating config file
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: INFO:__main__:Writing out command to execute
Nov 29 01:30:36 np0005539509 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: ++ cat /run_command
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + ARGS=
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + sudo kolla_copy_cacerts
Nov 29 01:30:36 np0005539509 systemd[1]: Started Session c2 of User root.
Nov 29 01:30:36 np0005539509 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + [[ ! -n '' ]]
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + . kolla_extend_start
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + umask 0022
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6425] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6432] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6443] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 01:30:36 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6448] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6451] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:30:36 np0005539509 kernel: br-int: entered promiscuous mode
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 01:30:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:36.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:36 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:36Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6755] manager: (ovn-e15f55-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 01:30:36 np0005539509 systemd-udevd[130173]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:30:36 np0005539509 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6955] device (genev_sys_6081): carrier: link connected
Nov 29 01:30:36 np0005539509 NetworkManager[49015]: <info>  [1764397836.6957] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 01:30:36 np0005539509 systemd-udevd[130175]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:30:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:37.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:38.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:39.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:39 np0005539509 NetworkManager[49015]: <info>  [1764397839.5403] manager: (ovn-fa6f2e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 01:30:40 np0005539509 NetworkManager[49015]: <info>  [1764397840.0786] manager: (ovn-93db78-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 01:30:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:40 np0005539509 python3.9[130306]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:40 np0005539509 ovs-vsctl[130307]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 01:30:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:40.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:41.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:41 np0005539509 python3.9[130459]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:41 np0005539509 ovs-vsctl[130461]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 01:30:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:42.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:42 np0005539509 python3.9[130614]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:42 np0005539509 ovs-vsctl[130615]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 01:30:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:43.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:43 np0005539509 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 01:30:43 np0005539509 systemd[1]: session-45.scope: Consumed 1min 244ms CPU time.
Nov 29 01:30:43 np0005539509 systemd-logind[785]: Session 45 logged out. Waiting for processes to exit.
Nov 29 01:30:43 np0005539509 systemd-logind[785]: Removed session 45.
Nov 29 01:30:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:44.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:45.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:46.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:46 np0005539509 systemd[1]: Stopping User Manager for UID 0...
Nov 29 01:30:46 np0005539509 systemd[130077]: Activating special unit Exit the Session...
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped target Main User Target.
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped target Basic System.
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped target Paths.
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped target Sockets.
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped target Timers.
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:30:46 np0005539509 systemd[130077]: Closed D-Bus User Message Bus Socket.
Nov 29 01:30:46 np0005539509 systemd[130077]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:30:46 np0005539509 systemd[130077]: Removed slice User Application Slice.
Nov 29 01:30:46 np0005539509 systemd[130077]: Reached target Shutdown.
Nov 29 01:30:46 np0005539509 systemd[130077]: Finished Exit the Session.
Nov 29 01:30:46 np0005539509 systemd[130077]: Reached target Exit the Session.
Nov 29 01:30:46 np0005539509 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 01:30:46 np0005539509 systemd[1]: Stopped User Manager for UID 0.
Nov 29 01:30:46 np0005539509 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 01:30:46 np0005539509 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 01:30:46 np0005539509 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 01:30:46 np0005539509 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 01:30:46 np0005539509 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 01:30:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:47.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:47 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:47Z|00025|memory|INFO|15872 kB peak resident set size after 11.0 seconds
Nov 29 01:30:47 np0005539509 ovn_controller[130039]: 2025-11-29T06:30:47Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 29 01:30:48 np0005539509 systemd-logind[785]: New session 47 of user zuul.
Nov 29 01:30:48 np0005539509 systemd[1]: Started Session 47 of User zuul.
Nov 29 01:30:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:48.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:49.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:49 np0005539509 python3.9[130796]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:30:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:51.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:52.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:52 np0005539509 python3.9[130952]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:30:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:53.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:30:53 np0005539509 python3.9[131104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:54 np0005539509 python3.9[131256]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:54.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:54 np0005539509 python3.9[131409]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:55.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:55 np0005539509 python3.9[131561]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:56 np0005539509 python3.9[131711]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:30:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:56.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:57.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:57 np0005539509 python3.9[131863]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:30:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:58.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:59 np0005539509 python3.9[132013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:30:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:59.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:59 np0005539509 python3.9[132134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397858.4564419-224-157402102927445/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:00 np0005539509 python3.9[132284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:00.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:01 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 29 01:31:01 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Nov 29 01:31:01 np0005539509 python3.9[132405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397859.992953-269-263644126277502/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:01.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:01 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Nov 29 01:31:01 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 29 01:31:02 np0005539509 python3.9[132557]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:31:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:02.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:03 np0005539509 python3.9[132641]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:31:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:03.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:04.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:05.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:05 np0005539509 python3.9[132794]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:31:06 np0005539509 podman[132871]: 2025-11-29 06:31:06.426909742 +0000 UTC m=+0.142793105 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:31:06 np0005539509 python3.9[132974]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:06.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:07.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:07 np0005539509 python3.9[133101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397866.1817796-380-140182130432464/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:07 np0005539509 python3.9[133366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:08 np0005539509 python3.9[133496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397867.419116-380-254305290288700/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:08.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:09.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:09 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:09 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:10 np0005539509 python3.9[133646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:10 np0005539509 python3.9[133767]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397869.5160851-512-258296720298783/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:10.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:10 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:31:10 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:31:10 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:10 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:31:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:11.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:11 np0005539509 python3.9[133917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:12 np0005539509 python3.9[134038]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397870.8335085-512-189465354089049/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:12.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:12 np0005539509 python3.9[134188]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:31:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:13.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:13 np0005539509 python3.9[134342]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:14 np0005539509 python3.9[134494]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:14.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:14 np0005539509 python3.9[134572]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:15.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:15 np0005539509 python3.9[134724]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:16 np0005539509 python3.9[134802]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:16.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:17 np0005539509 python3.9[134954]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:17.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:17 np0005539509 python3.9[135106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:18 np0005539509 python3.9[135184]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:18.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:19.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:19 np0005539509 python3.9[135336]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:20 np0005539509 python3.9[135414]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:20.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:21.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:21 np0005539509 python3.9[135566]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:21 np0005539509 systemd[1]: Reloading.
Nov 29 01:31:21 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:21 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:22 np0005539509 python3.9[135755]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:22.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:22 np0005539509 python3.9[135833]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:23.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:23 np0005539509 python3.9[135985]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:24 np0005539509 python3.9[136063]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:24.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:25.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:25 np0005539509 python3.9[136215]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:25 np0005539509 systemd[1]: Reloading.
Nov 29 01:31:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:25 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:25 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:25 np0005539509 systemd[1]: Starting Create netns directory...
Nov 29 01:31:25 np0005539509 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:31:25 np0005539509 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:31:25 np0005539509 systemd[1]: Finished Create netns directory.
Nov 29 01:31:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:26.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:27 np0005539509 python3.9[136409]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:27 np0005539509 python3.9[136561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:28 np0005539509 python3.9[136684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397887.4206333-965-64710958266798/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:28.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:29.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:29 np0005539509 python3.9[136836]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:30 np0005539509 python3.9[136988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:30.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:31 np0005539509 python3.9[137111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397889.995605-1040-72687177317324/.source.json _original_basename=._gnf33cx follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.203689) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891203890, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1376, "num_deletes": 252, "total_data_size": 3197835, "memory_usage": 3240768, "flush_reason": "Manual Compaction"}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891224566, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 2087601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9208, "largest_seqno": 10579, "table_properties": {"data_size": 2081808, "index_size": 3124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12444, "raw_average_key_size": 19, "raw_value_size": 2069865, "raw_average_value_size": 3254, "num_data_blocks": 144, "num_entries": 636, "num_filter_entries": 636, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397737, "oldest_key_time": 1764397737, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 20919 microseconds, and 8458 cpu microseconds.
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.224628) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 2087601 bytes OK
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.224651) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.227122) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.227142) EVENT_LOG_v1 {"time_micros": 1764397891227135, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.227162) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 3191443, prev total WAL file size 3191443, number of live WAL files 2.
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.228320) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(2038KB)], [18(9227KB)]
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891228429, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 11536895, "oldest_snapshot_seqno": -1}
Nov 29 01:31:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:31.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4002 keys, 9547481 bytes, temperature: kUnknown
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891302200, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 9547481, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9515989, "index_size": 20374, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 98620, "raw_average_key_size": 24, "raw_value_size": 9438770, "raw_average_value_size": 2358, "num_data_blocks": 889, "num_entries": 4002, "num_filter_entries": 4002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.302734) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 9547481 bytes
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.304408) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.7 rd, 128.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.0 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(10.1) write-amplify(4.6) OK, records in: 4522, records dropped: 520 output_compression: NoCompression
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.304431) EVENT_LOG_v1 {"time_micros": 1764397891304421, "job": 8, "event": "compaction_finished", "compaction_time_micros": 74110, "compaction_time_cpu_micros": 25635, "output_level": 6, "num_output_files": 1, "total_output_size": 9547481, "num_input_records": 4522, "num_output_records": 4002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891305290, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891307341, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.228101) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307597) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307600) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:31:31.307602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:32 np0005539509 python3.9[137263]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:32.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:33.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:34.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:35.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:35 np0005539509 python3.9[137690]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 01:31:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:36 np0005539509 python3.9[137844]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:31:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:37.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:37 np0005539509 podman[137869]: 2025-11-29 06:31:37.380468316 +0000 UTC m=+0.122874980 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 01:31:38 np0005539509 python3.9[138021]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:31:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:38.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:39.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:40 np0005539509 python3[138200]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:31:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:40.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:41.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:43.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:46.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:47.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:48.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:49.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:50.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:51.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:51 np0005539509 podman[138213]: 2025-11-29 06:31:51.561648508 +0000 UTC m=+11.292585137 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:31:51 np0005539509 podman[138397]: 2025-11-29 06:31:51.729919259 +0000 UTC m=+0.052782612 container create b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:31:51 np0005539509 podman[138397]: 2025-11-29 06:31:51.699180936 +0000 UTC m=+0.022044339 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:31:51 np0005539509 python3[138200]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:31:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:52.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:53.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:54.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:55.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:56.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:57.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:58.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:31:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:59.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:00.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:01.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:02.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:03.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:04.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:05.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:06.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:07.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:08 np0005539509 podman[138463]: 2025-11-29 06:32:08.404409203 +0000 UTC m=+0.138401523 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:32:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:08.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:08 np0005539509 python3.9[138616]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:09.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:09 np0005539509 python3.9[138770]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:10 np0005539509 python3.9[138846]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:10.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:10 np0005539509 python3.9[138997]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397930.208002-1304-140102981237999/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:11.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:11 np0005539509 python3.9[139073]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:11 np0005539509 systemd[1]: Reloading.
Nov 29 01:32:11 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:11 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:12.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:13 np0005539509 python3.9[139186]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:13 np0005539509 systemd[1]: Reloading.
Nov 29 01:32:13 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:13 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:13.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:13 np0005539509 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 01:32:13 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:32:13 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5980339fbd23606fd60482cabb2ffd0e0b84bee0e6bfb5159a1075dd20c3eed/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:13 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5980339fbd23606fd60482cabb2ffd0e0b84bee0e6bfb5159a1075dd20c3eed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:13 np0005539509 systemd[1]: Started /usr/bin/podman healthcheck run b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2.
Nov 29 01:32:13 np0005539509 podman[139226]: 2025-11-29 06:32:13.658706122 +0000 UTC m=+0.288601274 container init b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + sudo -E kolla_set_configs
Nov 29 01:32:13 np0005539509 podman[139226]: 2025-11-29 06:32:13.688980352 +0000 UTC m=+0.318875494 container start b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:32:13 np0005539509 edpm-start-podman-container[139226]: ovn_metadata_agent
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Validating config file
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Copying service configuration files
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Writing out command to execute
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 01:32:13 np0005539509 edpm-start-podman-container[139225]: Creating additional drop-in dependency for "ovn_metadata_agent" (b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2)
Nov 29 01:32:13 np0005539509 podman[139248]: 2025-11-29 06:32:13.770974114 +0000 UTC m=+0.053844221 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: ++ cat /run_command
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + CMD=neutron-ovn-metadata-agent
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + ARGS=
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + sudo kolla_copy_cacerts
Nov 29 01:32:13 np0005539509 systemd[1]: Reloading.
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + [[ ! -n '' ]]
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + . kolla_extend_start
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + umask 0022
Nov 29 01:32:13 np0005539509 ovn_metadata_agent[139241]: + exec neutron-ovn-metadata-agent
Nov 29 01:32:13 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:13 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:14 np0005539509 systemd[1]: Started ovn_metadata_agent container.
Nov 29 01:32:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:14.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:15.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.849 139246 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.849 139246 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.849 139246 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.850 139246 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.851 139246 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.852 139246 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.853 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.854 139246 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.855 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.856 139246 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.857 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.858 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.859 139246 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.860 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.861 139246 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.862 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.863 139246 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.864 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.865 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.866 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.867 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.868 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.869 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.870 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.871 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.872 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.873 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.874 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.875 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.876 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.877 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.878 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.879 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.880 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.881 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.882 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.883 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.884 139246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.885 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.886 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.887 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.888 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.889 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.889 139246 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.889 139246 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.898 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.898 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.898 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.899 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.899 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.912 139246 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2fa83236-07b6-4ff7-bb56-9f4f13bed719 (UUID: 2fa83236-07b6-4ff7-bb56-9f4f13bed719) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.938 139246 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.941 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.947 139246 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.953 139246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2fa83236-07b6-4ff7-bb56-9f4f13bed719'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f40cd05d6d0>], external_ids={}, name=2fa83236-07b6-4ff7-bb56-9f4f13bed719, nb_cfg_timestamp=1764397844667, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.955 139246 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f40cd056f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.956 139246 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.961 139246 DEBUG oslo_service.service [-] Started child 139354 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.965 139246 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpl53aly_m/privsep.sock']#033[00m
Nov 29 01:32:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:15.966 139354 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-168415'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.001 139354 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.002 139354 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.002 139354 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.010 139354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.018 139354 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.025 139354 INFO eventlet.wsgi.server [-] (139354) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 29 01:32:16 np0005539509 systemd[1]: session-47.scope: Deactivated successfully.
Nov 29 01:32:16 np0005539509 systemd[1]: session-47.scope: Consumed 1min 847ms CPU time.
Nov 29 01:32:16 np0005539509 systemd-logind[785]: Session 47 logged out. Waiting for processes to exit.
Nov 29 01:32:16 np0005539509 systemd-logind[785]: Removed session 47.
Nov 29 01:32:16 np0005539509 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.742 139246 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.742 139246 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpl53aly_m/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.590 139359 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.595 139359 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.597 139359 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.597 139359 INFO oslo.privsep.daemon [-] privsep daemon running as pid 139359#033[00m
Nov 29 01:32:16 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:16.745 139359 DEBUG oslo.privsep.daemon [-] privsep: reply[31ff0470-bc39-4c86-a9ef-d058219f2c8b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:32:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:16.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.277 139359 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.277 139359 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.277 139359 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:32:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:17.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.846 139359 DEBUG oslo.privsep.daemon [-] privsep: reply[18f9256b-8ccd-4fdf-8dd6-478b787b6a3f]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.849 139246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2fa83236-07b6-4ff7-bb56-9f4f13bed719, column=external_ids, values=({'neutron:ovn-metadata-id': '0373c087-79f4-5325-b3bb-60a5df9a729a'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.859 139246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2fa83236-07b6-4ff7-bb56-9f4f13bed719, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.868 139246 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.868 139246 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.868 139246 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.869 139246 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.870 139246 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.871 139246 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.872 139246 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.873 139246 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.874 139246 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.875 139246 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.876 139246 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.877 139246 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.878 139246 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.879 139246 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.880 139246 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.881 139246 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.882 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.883 139246 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.884 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.885 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.886 139246 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.887 139246 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.888 139246 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.889 139246 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.890 139246 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.891 139246 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.892 139246 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.893 139246 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.894 139246 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.895 139246 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.896 139246 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.897 139246 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.898 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.899 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.900 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:32:17.901 139246 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:32:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:18.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:19.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:20.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:21.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:22 np0005539509 systemd-logind[785]: New session 48 of user zuul.
Nov 29 01:32:22 np0005539509 systemd[1]: Started Session 48 of User zuul.
Nov 29 01:32:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:22.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:23 np0005539509 python3.9[139517]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:32:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:23.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:24.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:25 np0005539509 python3.9[139673]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:32:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:25.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:25 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:32:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:26.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:27.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:27 np0005539509 python3.9[139837]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:27 np0005539509 systemd[1]: Reloading.
Nov 29 01:32:28 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:28 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:28.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:29.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:29 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:32:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).paxos(paxos updating c 503..1046) lease_timeout -- calling new election
Nov 29 01:32:29 np0005539509 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 01:32:29 np0005539509 ceph-mon[80754]: paxos.2).electionLogic(22) init, last seen epoch 22
Nov 29 01:32:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:30.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:31.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:32 np0005539509 python3.9[140022]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:32:32 np0005539509 network[140039]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:32:32 np0005539509 network[140040]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:32:32 np0005539509 network[140041]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:32:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:32.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:33.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:33 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:32:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:34.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:35.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:35 np0005539509 ceph-mon[80754]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Nov 29 01:32:35 np0005539509 ceph-mon[80754]: paxos.2).electionLogic(26) init, last seen epoch 26
Nov 29 01:32:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:36.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:37 np0005539509 python3.9[140303]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:37.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:37 np0005539509 python3.9[140456]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:38 np0005539509 podman[140581]: 2025-11-29 06:32:38.631769419 +0000 UTC m=+0.125217645 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-2 calling monitor election
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-1 calling monitor election
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-0 calling monitor election
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:32:38 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:32:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:38.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:38 np0005539509 python3.9[140629]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:39.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:39 np0005539509 python3.9[140787]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:40 np0005539509 python3.9[140940]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:40.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:41.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:41 np0005539509 python3.9[141093]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:42.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:43 np0005539509 python3.9[141246]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:43.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:44 np0005539509 podman[141347]: 2025-11-29 06:32:44.359309977 +0000 UTC m=+0.081740782 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 01:32:44 np0005539509 python3.9[141418]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:44.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:45.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:45 np0005539509 python3.9[141571]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:46 np0005539509 python3.9[141723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:47.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:47 np0005539509 python3.9[141993]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:48 np0005539509 python3.9[142159]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:49 np0005539509 python3.9[142311]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:49.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:49 np0005539509 python3.9[142463]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:51.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:52 np0005539509 python3.9[142615]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:53 np0005539509 python3.9[142767]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:53.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:53 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:32:53 np0005539509 python3.9[142919]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:54 np0005539509 python3.9[143071]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:55.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:55 np0005539509 python3.9[143223]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:56 np0005539509 python3.9[143375]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:56 np0005539509 python3.9[143527]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:56.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:57.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:32:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:32:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:32:58 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:58 np0005539509 python3.9[143679]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:32:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:58.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:32:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:59.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:00 np0005539509 python3.9[143831]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:33:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:01 np0005539509 python3.9[143983]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:33:01 np0005539509 systemd[1]: Reloading.
Nov 29 01:33:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:01.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:01 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:33:01 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:33:02 np0005539509 python3.9[144170]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:03.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:03 np0005539509 python3.9[144323]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:04 np0005539509 python3.9[144476]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:05 np0005539509 python3.9[144629]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:05.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:05 np0005539509 python3.9[144782]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:06 np0005539509 python3.9[144935]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:07 np0005539509 python3.9[145088]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:07.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:08.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:09 np0005539509 podman[145213]: 2025-11-29 06:33:09.372479916 +0000 UTC m=+0.135485413 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 01:33:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:09.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:09 np0005539509 python3.9[145259]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 01:33:10 np0005539509 python3.9[145420]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:33:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:10.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:33:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:33:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:11.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:11 np0005539509 python3.9[145628]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:33:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:12.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:13.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:14.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:15 np0005539509 podman[145661]: 2025-11-29 06:33:15.306411434 +0000 UTC m=+0.054754439 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:33:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:15.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:33:15.892 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:33:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:33:15.893 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:33:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:33:15.894 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:33:16 np0005539509 python3.9[145808]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:33:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:16.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:17 np0005539509 python3.9[145892]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:33:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:17.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:18.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:19.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:20.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:21.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:22.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:23.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:24.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:25.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:26.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:27.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:28.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:29.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:30.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:31.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:32.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:33.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:34.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:35.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:36.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:37.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:38.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:39.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:40 np0005539509 podman[146086]: 2025-11-29 06:33:40.444473231 +0000 UTC m=+0.153030676 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:33:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:41.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:41.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:43.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:43.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:45.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:45.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:46 np0005539509 podman[146115]: 2025-11-29 06:33:46.378374293 +0000 UTC m=+0.108509279 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:33:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:47.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:47.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 01:33:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:49.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 01:33:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:51.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:51 np0005539509 kernel: SELinux:  Converting 2769 SID table entries...
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:33:51 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:33:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:51.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:53.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:53.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:55.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:55.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:57.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:57.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:59.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:33:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:59.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:01.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:01.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:01 np0005539509 kernel: SELinux:  Converting 2769 SID table entries...
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:34:01 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:34:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:03.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:03.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:05.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:05.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:07.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:07.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:09.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:10 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 01:34:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:11.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:11 np0005539509 podman[146151]: 2025-11-29 06:34:11.114541316 +0000 UTC m=+0.101668396 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:34:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:13.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:34:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:34:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:34:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:13.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:15.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:15.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:34:15.894 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:34:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:34:15.895 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:34:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:34:15.896 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:34:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:17.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:17 np0005539509 podman[147736]: 2025-11-29 06:34:17.325650732 +0000 UTC m=+0.062002758 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 01:34:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:17.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:19.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:19.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:21.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:23.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:23.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:24 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:25.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:25.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:27.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:29.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:29 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:29.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:31.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:31.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:33.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:33.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:34 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:35.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:35.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:37.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:37.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:39.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:39.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:41.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:41 np0005539509 podman[161643]: 2025-11-29 06:34:41.333141989 +0000 UTC m=+0.073070027 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 01:34:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:41.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:43.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:43.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:34:44 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:34:44 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:45.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:45.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:47.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:47.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:47 np0005539509 podman[163192]: 2025-11-29 06:34:47.609924941 +0000 UTC m=+0.051519056 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:34:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:49.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:49.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:51.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:51.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:53.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:55.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:55 np0005539509 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:34:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:57.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:34:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:34:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:59.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:34:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:01.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:01.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:02 np0005539509 kernel: SELinux:  Converting 2770 SID table entries...
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:35:02 np0005539509 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:35:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:03.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:04 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:05.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:05.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:07.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:07.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:09.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:09.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:10 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:35:10 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 01:35:10 np0005539509 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Nov 29 01:35:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:11.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:11.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:12 np0005539509 podman[163251]: 2025-11-29 06:35:12.380265062 +0000 UTC m=+0.107428720 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:35:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:13.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:15.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:15.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:35:15.900 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:35:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:35:15.902 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:35:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:35:15.902 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:35:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 01:35:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:17.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 01:35:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:17.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:18 np0005539509 podman[163498]: 2025-11-29 06:35:18.330138183 +0000 UTC m=+0.067244920 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:35:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:19.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:19.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:20 np0005539509 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 01:35:20 np0005539509 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 01:35:20 np0005539509 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 01:35:20 np0005539509 systemd[1]: sshd.service: Consumed 6.695s CPU time, read 32.0K from disk, written 184.0K to disk.
Nov 29 01:35:20 np0005539509 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 01:35:20 np0005539509 systemd[1]: Stopping sshd-keygen.target...
Nov 29 01:35:20 np0005539509 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:20 np0005539509 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:20 np0005539509 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:20 np0005539509 systemd[1]: Reached target sshd-keygen.target.
Nov 29 01:35:20 np0005539509 systemd[1]: Starting OpenSSH server daemon...
Nov 29 01:35:20 np0005539509 systemd[1]: Started OpenSSH server daemon.
Nov 29 01:35:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:21.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:21.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:21 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:35:22 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:35:22 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:35:23 np0005539509 systemd[1]: Reloading.
Nov 29 01:35:23 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:35:23 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:35:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:23.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:23 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:35:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:23.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:25.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:25.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:27.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:27.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:29.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:31 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:35:31 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:35:31 np0005539509 systemd[1]: man-db-cache-update.service: Consumed 10.043s CPU time.
Nov 29 01:35:31 np0005539509 systemd[1]: run-r802f4fe76eb0417b87be99fe7d8cb287.service: Deactivated successfully.
Nov 29 01:35:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:31.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:33.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:35.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:35.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:37.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:37.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:39.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:39.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:41.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:41.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:43.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:43 np0005539509 podman[172821]: 2025-11-29 06:35:43.381743552 +0000 UTC m=+0.132987968 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:35:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:43.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:45.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:45.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:47.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:47.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:49.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:49 np0005539509 podman[172955]: 2025-11-29 06:35:49.337624214 +0000 UTC m=+0.072144309 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:35:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:51.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:35:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:35:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:53 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:35:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:53.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:55.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:55.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:57.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:35:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:59.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:01.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:01.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:01 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:36:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:03.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:03.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:05.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:07.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:09.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:11.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:11.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:36:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:13.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:36:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:13.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:14 np0005539509 podman[172975]: 2025-11-29 06:36:14.392881587 +0000 UTC m=+0.121730016 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 01:36:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:15.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:15.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:36:15.902 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:36:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:36:15.903 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:36:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:36:15.903 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:36:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:17.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:17.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:19.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:19.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:20 np0005539509 podman[173005]: 2025-11-29 06:36:20.335166107 +0000 UTC m=+0.072062158 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:36:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:21.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:21.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:23.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:23.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:25.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.004000104s ======
Nov 29 01:36:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:25.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000104s
Nov 29 01:36:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:27.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:27.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:29.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:29 np0005539509 ceph-mds[84384]: mds.beacon.cephfs.compute-1.vlqnad missed beacon ack from the monitors
Nov 29 01:36:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:29.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:31.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:31.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:33.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:33.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:35.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:35.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:37.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:37.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:39.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:39.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:36:41 np0005539509 python3.9[173152]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:41 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:41.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:41 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:41 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:41.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:41 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:36:42 np0005539509 python3.9[173392]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:42 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:42 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:42 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:43.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:43 np0005539509 python3.9[173582]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:43 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:43 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:43 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:43.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:44 np0005539509 podman[173772]: 2025-11-29 06:36:44.619305999 +0000 UTC m=+0.171274901 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:36:44 np0005539509 python3.9[173773]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:44 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:44 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:44 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:45.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:45.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:47.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:47 np0005539509 python3.9[173990]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:47 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:47 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:47 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:47.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:48 np0005539509 python3.9[174180]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:48 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:48 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:48 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:49.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:49.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:49 np0005539509 python3.9[174371]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:50 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:50 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:50 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:50 np0005539509 podman[174409]: 2025-11-29 06:36:50.481142868 +0000 UTC m=+0.072525251 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 01:36:51 np0005539509 python3.9[174580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:51.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:51.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:52 np0005539509 python3.9[174735]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:52 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:52 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:52 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:53.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:53.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:53 np0005539509 python3.9[174924]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:54 np0005539509 systemd[1]: Reloading.
Nov 29 01:36:54 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:54 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:54 np0005539509 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 01:36:54 np0005539509 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 01:36:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:55.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:55.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:57 np0005539509 python3.9[175116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.473165) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217473425, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2441, "num_deletes": 251, "total_data_size": 6349396, "memory_usage": 6431848, "flush_reason": "Manual Compaction"}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217517600, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4155641, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10584, "largest_seqno": 13020, "table_properties": {"data_size": 4145723, "index_size": 6348, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19629, "raw_average_key_size": 20, "raw_value_size": 4125981, "raw_average_value_size": 4210, "num_data_blocks": 284, "num_entries": 980, "num_filter_entries": 980, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397891, "oldest_key_time": 1764397891, "file_creation_time": 1764398217, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 44501 microseconds, and 16931 cpu microseconds.
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.517663) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4155641 bytes OK
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.517689) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.519870) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.519896) EVENT_LOG_v1 {"time_micros": 1764398217519889, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.519920) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6338934, prev total WAL file size 6338934, number of live WAL files 2.
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.522453) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4058KB)], [21(9323KB)]
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217522652, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 13703122, "oldest_snapshot_seqno": -1}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4464 keys, 10609978 bytes, temperature: kUnknown
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217611716, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 10609978, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10575161, "index_size": 22547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11205, "raw_key_size": 109199, "raw_average_key_size": 24, "raw_value_size": 10489542, "raw_average_value_size": 2349, "num_data_blocks": 972, "num_entries": 4464, "num_filter_entries": 4464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398217, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.612042) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 10609978 bytes
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.613653) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.7 rd, 119.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4982, records dropped: 518 output_compression: NoCompression
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.613674) EVENT_LOG_v1 {"time_micros": 1764398217613661, "job": 10, "event": "compaction_finished", "compaction_time_micros": 89169, "compaction_time_cpu_micros": 34863, "output_level": 6, "num_output_files": 1, "total_output_size": 10609978, "num_input_records": 4982, "num_output_records": 4464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217614488, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217616635, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.522321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:36:57.616755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:57.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:57 np0005539509 python3.9[175271]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:58 np0005539509 python3.9[175426]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:59.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:59 np0005539509 python3.9[175581]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:36:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:59.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:00 np0005539509 python3.9[175736]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:01 np0005539509 python3.9[175891]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:01.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:02 np0005539509 python3.9[176046]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:03 np0005539509 python3.9[176201]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:03.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:03.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:04 np0005539509 python3.9[176356]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:04 np0005539509 python3.9[176511]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:05.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:05 np0005539509 python3.9[176666]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:05.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:06 np0005539509 python3.9[176821]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:07.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:07 np0005539509 python3.9[176976]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:07.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:08 np0005539509 python3.9[177131]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:09.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:09.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:11 np0005539509 python3.9[177288]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:11.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:11.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:11 np0005539509 python3.9[177440]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:12 np0005539509 python3.9[177592]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:13 np0005539509 python3.9[177744]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:13.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:13.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:13 np0005539509 python3.9[177896]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:14 np0005539509 python3.9[178048]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:15.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:15 np0005539509 podman[178148]: 2025-11-29 06:37:15.394397965 +0000 UTC m=+0.129918206 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 29 01:37:15 np0005539509 python3.9[178226]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:37:15.903 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:37:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:37:15.904 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:37:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:37:15.905 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:37:16 np0005539509 python3.9[178352]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398234.9247022-1631-272444055300555/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:16 np0005539509 python3.9[178506]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:17.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:17 np0005539509 python3.9[178631]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398236.3639603-1631-86036261312485/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:17.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:18 np0005539509 python3.9[178783]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:18 np0005539509 python3.9[178908]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.6124299-1631-266298084345523/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:19 np0005539509 python3.9[179060]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:19.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:19.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:19 np0005539509 python3.9[179185]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.8239045-1631-178566270045473/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:20 np0005539509 python3.9[179337]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:21 np0005539509 podman[179434]: 2025-11-29 06:37:21.296622574 +0000 UTC m=+0.092095604 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:37:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:37:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:21.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:37:21 np0005539509 python3.9[179477]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398240.1058025-1631-220718953633725/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:21.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:22 np0005539509 python3.9[179632]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:22 np0005539509 python3.9[179757]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.6541767-1631-251541886994568/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:23 np0005539509 python3.9[179909]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:23.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:24.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:25 np0005539509 python3.9[180032]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.8746698-1631-227206292202920/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:25.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:25 np0005539509 python3.9[180185]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:26 np0005539509 python3.9[180310]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398245.2563784-1631-76949557507197/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:26.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:27 np0005539509 python3.9[180462]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 01:37:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:27.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:29.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:31 np0005539509 python3.9[180617]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:31.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:31 np0005539509 python3.9[180769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:32 np0005539509 python3.9[180921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:33 np0005539509 python3.9[181073]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:33.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:33 np0005539509 python3.9[181225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:34 np0005539509 python3.9[181377]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:34.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:35 np0005539509 python3.9[181529]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:35.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:35 np0005539509 python3.9[181681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:36 np0005539509 python3.9[181833]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:36.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:37 np0005539509 python3.9[181985]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:37.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:37 np0005539509 python3.9[182137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539509 python3.9[182289]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:38.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:39 np0005539509 python3.9[182441]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:39.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:39 np0005539509 python3.9[182593]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:40.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:41.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:42 np0005539509 podman[182922]: 2025-11-29 06:37:42.042558225 +0000 UTC m=+0.077726469 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default)
Nov 29 01:37:42 np0005539509 python3.9[182921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:42 np0005539509 podman[182922]: 2025-11-29 06:37:42.150519243 +0000 UTC m=+0.185687457 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Nov 29 01:37:42 np0005539509 python3.9[183153]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398261.6088867-2294-79249116156935/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:42.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:43 np0005539509 python3.9[183320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:43.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:43 np0005539509 python3.9[183555]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398262.894948-2294-2001492489449/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:44 np0005539509 python3.9[183724]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:45 np0005539509 python3.9[183847]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398264.1167877-2294-131465536107601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:45.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:45 np0005539509 podman[183971]: 2025-11-29 06:37:45.767582501 +0000 UTC m=+0.136953904 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 01:37:45 np0005539509 python3.9[184016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:46 np0005539509 python3.9[184148]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398265.3699958-2294-47270268184019/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:46.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:47 np0005539509 python3.9[184300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:47.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:47 np0005539509 python3.9[184423]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398266.6962094-2294-211728434380370/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:48 np0005539509 python3.9[184575]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:48.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:37:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:37:49 np0005539509 python3.9[184698]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398268.1008565-2294-13675310506462/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:49.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:49 np0005539509 python3.9[184850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:50 np0005539509 python3.9[184973]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398269.4193218-2294-119258112079153/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:50.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:51 np0005539509 python3.9[185125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:51.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:51 np0005539509 podman[185220]: 2025-11-29 06:37:51.63602868 +0000 UTC m=+0.069419567 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:37:51 np0005539509 python3.9[185265]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398270.6600578-2294-241243882900888/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:52 np0005539509 python3.9[185417]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:37:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:52.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:37:53 np0005539509 python3.9[185540]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398272.0413673-2294-224655406299635/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:53.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:53 np0005539509 python3.9[185692]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:54 np0005539509 python3.9[185815]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398273.2643209-2294-171697872700200/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:54.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:55.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:55 np0005539509 python3.9[185967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:56 np0005539509 python3.9[186090]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398274.712713-2294-73325298030304/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:56.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:57 np0005539509 python3.9[186242]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:57.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:57 np0005539509 python3.9[186365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398276.7182071-2294-162280638203457/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:58 np0005539509 python3.9[186517]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:58.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:59 np0005539509 python3.9[186640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398278.1190464-2294-37751764207498/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:37:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:59.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:59 np0005539509 python3.9[186792]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:00 np0005539509 python3.9[186915]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398279.3315175-2294-19475005926558/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:00.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:01.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:01 np0005539509 python3.9[187066]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:02.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:03.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:03 np0005539509 python3.9[187221]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 01:38:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:04.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:05.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:06.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:07 np0005539509 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 01:38:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:07.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:38:08 np0005539509 python3.9[187428]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:08.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:09 np0005539509 python3.9[187580]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:09.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:10 np0005539509 python3.9[187732]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:10 np0005539509 python3.9[187884]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:10.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:38:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:11.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:11 np0005539509 python3.9[188036]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:12 np0005539509 python3.9[188188]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:12 np0005539509 auditd[701]: Audit daemon rotating log files
Nov 29 01:38:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:12.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:13 np0005539509 python3.9[188340]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:13.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:13 np0005539509 python3.9[188492]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:14 np0005539509 python3.9[188644]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:14.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:15 np0005539509 python3.9[188796]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:15.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:38:15.904 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:38:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:38:15.906 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:38:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:38:15.907 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:38:16 np0005539509 podman[188920]: 2025-11-29 06:38:16.10482321 +0000 UTC m=+0.122306001 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 01:38:16 np0005539509 python3.9[188965]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:16 np0005539509 systemd[1]: Reloading.
Nov 29 01:38:16 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:16 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:16 np0005539509 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 01:38:16 np0005539509 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 01:38:16 np0005539509 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 01:38:16 np0005539509 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 01:38:16 np0005539509 systemd[1]: Starting libvirt logging daemon...
Nov 29 01:38:16 np0005539509 systemd[1]: Started libvirt logging daemon.
Nov 29 01:38:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:16.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:17.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:17 np0005539509 python3.9[189168]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:17 np0005539509 systemd[1]: Reloading.
Nov 29 01:38:17 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:17 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:17 np0005539509 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 01:38:17 np0005539509 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 01:38:17 np0005539509 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 01:38:17 np0005539509 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 01:38:17 np0005539509 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 01:38:17 np0005539509 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 01:38:17 np0005539509 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:38:18 np0005539509 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:38:18 np0005539509 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 01:38:18 np0005539509 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 01:38:18 np0005539509 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 01:38:18 np0005539509 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 01:38:18 np0005539509 python3.9[189393]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:18 np0005539509 systemd[1]: Reloading.
Nov 29 01:38:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:18.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:19 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:19 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:19 np0005539509 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 01:38:19 np0005539509 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 01:38:19 np0005539509 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 01:38:19 np0005539509 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 01:38:19 np0005539509 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:38:19 np0005539509 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:38:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:19.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:19 np0005539509 setroubleshoot[189232]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e72e4fab-4474-481d-9f3c-44e6530a45b6
Nov 29 01:38:19 np0005539509 setroubleshoot[189232]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:38:19 np0005539509 setroubleshoot[189232]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e72e4fab-4474-481d-9f3c-44e6530a45b6
Nov 29 01:38:19 np0005539509 setroubleshoot[189232]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:38:20 np0005539509 python3.9[189607]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:20 np0005539509 systemd[1]: Reloading.
Nov 29 01:38:20 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:20 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:20 np0005539509 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 01:38:20 np0005539509 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 01:38:20 np0005539509 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 01:38:20 np0005539509 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 01:38:20 np0005539509 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 01:38:20 np0005539509 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 01:38:20 np0005539509 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 01:38:20 np0005539509 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 01:38:20 np0005539509 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 01:38:20 np0005539509 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 01:38:20 np0005539509 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:38:20 np0005539509 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:38:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:20.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:21.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:21 np0005539509 python3.9[189823]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:21 np0005539509 systemd[1]: Reloading.
Nov 29 01:38:21 np0005539509 podman[189825]: 2025-11-29 06:38:21.933678317 +0000 UTC m=+0.070404614 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:38:21 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:21 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:22 np0005539509 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 01:38:22 np0005539509 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 01:38:22 np0005539509 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 01:38:22 np0005539509 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 01:38:22 np0005539509 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 01:38:22 np0005539509 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 01:38:22 np0005539509 systemd[1]: Starting libvirt secret daemon...
Nov 29 01:38:22 np0005539509 systemd[1]: Started libvirt secret daemon.
Nov 29 01:38:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:22.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:23.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:24 np0005539509 python3.9[190054]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:25.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:25 np0005539509 python3.9[190206]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:38:26 np0005539509 python3.9[190358]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:28 np0005539509 python3.9[190512]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:38:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:28.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:29 np0005539509 python3.9[190662]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:29.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:29 np0005539509 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 01:38:29 np0005539509 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.095s CPU time.
Nov 29 01:38:29 np0005539509 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 01:38:30 np0005539509 python3.9[190785]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398308.7800333-3368-16976233971827/.source.xml follow=False _original_basename=secret.xml.j2 checksum=63744b3abb892aaab98ed7226f328ffc66ff66bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:30 np0005539509 python3.9[190937]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 336ec58c-893b-528f-a0c1-6ed1196bc047#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:30.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:31.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:31 np0005539509 python3.9[191099]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:32.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:33.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:34 np0005539509 python3.9[191564]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:35.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:35 np0005539509 python3.9[191716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:35.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:35 np0005539509 python3.9[191839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398314.448359-3533-10779246222150/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:38:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 6284 writes, 25K keys, 6284 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 6284 writes, 1144 syncs, 5.49 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 444 writes, 711 keys, 444 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s#012Interval WAL: 444 writes, 204 syncs, 2.18 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Nov 29 01:38:36 np0005539509 python3.9[191991]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:37.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:37 np0005539509 python3.9[192143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:37.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:37 np0005539509 python3.9[192221]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:38 np0005539509 python3.9[192373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:39.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:39 np0005539509 python3.9[192451]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.j8b_lkih recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:39.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:39 np0005539509 python3.9[192603]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:40 np0005539509 python3.9[192681]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:41.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:41 np0005539509 python3.9[192833]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:42 np0005539509 python3[192986]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:38:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:43.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:43 np0005539509 python3.9[193138]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:43.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:43 np0005539509 python3.9[193216]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:44 np0005539509 python3.9[193368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:45.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:45 np0005539509 python3.9[193446]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:45.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:46 np0005539509 python3.9[193598]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:46 np0005539509 podman[193603]: 2025-11-29 06:38:46.394990329 +0000 UTC m=+0.119498717 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:38:46 np0005539509 python3.9[193702]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:47.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:47 np0005539509 python3.9[193854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:48 np0005539509 python3.9[193932]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:49.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:49 np0005539509 python3.9[194084]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:49.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:49 np0005539509 python3.9[194211]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.3961663-3908-210462400689457/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:50 np0005539509 python3.9[194363]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:51.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:51 np0005539509 python3.9[194515]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:51.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:52 np0005539509 podman[194672]: 2025-11-29 06:38:52.326391987 +0000 UTC m=+0.079916148 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 01:38:52 np0005539509 python3.9[194673]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:53.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:53 np0005539509 python3.9[194843]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:53.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:54 np0005539509 python3.9[194996]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:55.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:55 np0005539509 python3.9[195150]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:55.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:55 np0005539509 python3.9[195305]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:56 np0005539509 python3.9[195457]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:57.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:57 np0005539509 python3.9[195580]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398336.1676142-4124-17413112356666/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:57.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:58 np0005539509 python3.9[195732]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:59.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:59 np0005539509 python3.9[195855]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398337.9043186-4169-173428630325823/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:38:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:59.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:59 np0005539509 python3.9[196007]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:00 np0005539509 python3.9[196130]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398339.3179662-4214-44976795114830/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:01.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:01.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:01 np0005539509 python3.9[196282]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:01 np0005539509 systemd[1]: Reloading.
Nov 29 01:39:01 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:01 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:02 np0005539509 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 01:39:02 np0005539509 python3.9[196472]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:39:02 np0005539509 systemd[1]: Reloading.
Nov 29 01:39:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:03.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:03 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:03 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:03 np0005539509 systemd[1]: Reloading.
Nov 29 01:39:03 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:03 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:03.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:04 np0005539509 systemd[1]: session-48.scope: Deactivated successfully.
Nov 29 01:39:04 np0005539509 systemd[1]: session-48.scope: Consumed 3min 48.008s CPU time.
Nov 29 01:39:04 np0005539509 systemd-logind[785]: Session 48 logged out. Waiting for processes to exit.
Nov 29 01:39:04 np0005539509 systemd-logind[785]: Removed session 48.
Nov 29 01:39:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:05.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:05.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:07.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:39:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:07.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:39:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:09.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:09 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:09.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:09 np0005539509 systemd-logind[785]: New session 49 of user zuul.
Nov 29 01:39:10 np0005539509 systemd[1]: Started Session 49 of User zuul.
Nov 29 01:39:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:11.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:11 np0005539509 python3.9[196977]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:39:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:11 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:39:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:11.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:39:12 np0005539509 python3.9[197131]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:39:12 np0005539509 network[197148]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:39:12 np0005539509 network[197149]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:39:12 np0005539509 network[197150]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:39:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:13.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:15.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:15.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:39:15.906 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:39:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:39:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:39:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:39:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:39:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:17.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:17 np0005539509 podman[197318]: 2025-11-29 06:39:17.38269354 +0000 UTC m=+0.112818088 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:39:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:17.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:17 np0005539509 python3.9[197448]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:39:18 np0005539509 python3.9[197532]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:39:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:19.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:19.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:21.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:21.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:39:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 2457 writes, 14K keys, 2457 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.03 MB/s#012Cumulative WAL: 2457 writes, 2457 syncs, 1.00 writes per sync, written: 0.03 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1119 writes, 4827 keys, 1119 commit groups, 1.0 writes per commit group, ingest: 11.96 MB, 0.02 MB/s#012Interval WAL: 1119 writes, 1119 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     45.8      0.37              0.06         5    0.074       0      0       0.0       0.0#012  L6      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.2     76.8     65.0      0.58              0.13         4    0.146     17K   1772       0.0       0.0#012 Sum      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   3.2     47.0     57.5      0.96              0.18         9    0.106     17K   1772       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.2    105.3    110.1      0.23              0.09         4    0.057    9504   1038       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     76.8     65.0      0.58              0.13         4    0.146     17K   1772       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     64.6      0.26              0.06         4    0.066       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.017, interval 0.006#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.05 GB write, 0.05 MB/s write, 0.04 GB read, 0.04 MB/s read, 1.0 seconds#012Interval compaction: 0.02 GB write, 0.04 MB/s write, 0.02 GB read, 0.04 MB/s read, 0.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 304.00 MB usage: 1.50 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(66,1.32 MB,0.433641%) FilterBlock(9,58.36 KB,0.0187472%) IndexBlock(9,128.73 KB,0.0413543%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 29 01:39:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:23.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:23 np0005539509 podman[197586]: 2025-11-29 06:39:23.311154361 +0000 UTC m=+0.054133629 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 01:39:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:23.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:25.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:25 np0005539509 python3.9[197756]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:25.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:26 np0005539509 python3.9[197908]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:26 np0005539509 python3.9[198061]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:27.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:27 np0005539509 python3.9[198213]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:27.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:28 np0005539509 python3.9[198366]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.072784) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369072853, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1500, "num_deletes": 250, "total_data_size": 3710953, "memory_usage": 3755312, "flush_reason": "Manual Compaction"}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 29 01:39:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:29.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369129935, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1461897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13025, "largest_seqno": 14520, "table_properties": {"data_size": 1457009, "index_size": 2284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12208, "raw_average_key_size": 20, "raw_value_size": 1446470, "raw_average_value_size": 2402, "num_data_blocks": 103, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398218, "oldest_key_time": 1764398218, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 57227 microseconds, and 5099 cpu microseconds.
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.130004) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1461897 bytes OK
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.130031) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.154395) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.154441) EVENT_LOG_v1 {"time_micros": 1764398369154431, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.154468) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3704043, prev total WAL file size 3704043, number of live WAL files 2.
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.156436) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323533' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1427KB)], [24(10MB)]
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369156556, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12071875, "oldest_snapshot_seqno": -1}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4613 keys, 9170434 bytes, temperature: kUnknown
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369240842, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 9170434, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9137316, "index_size": 20464, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11589, "raw_key_size": 112549, "raw_average_key_size": 24, "raw_value_size": 9051697, "raw_average_value_size": 1962, "num_data_blocks": 883, "num_entries": 4613, "num_filter_entries": 4613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.241424) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 9170434 bytes
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.243876) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.7 rd, 108.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.1 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(14.5) write-amplify(6.3) OK, records in: 5066, records dropped: 453 output_compression: NoCompression
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.243937) EVENT_LOG_v1 {"time_micros": 1764398369243924, "job": 12, "event": "compaction_finished", "compaction_time_micros": 84617, "compaction_time_cpu_micros": 42828, "output_level": 6, "num_output_files": 1, "total_output_size": 9170434, "num_input_records": 5066, "num_output_records": 4613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369245165, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369248659, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.156219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:39:29.248970) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539509 python3.9[198489]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.9246383-251-37782226970519/.source.iscsi _original_basename=.iv8wzz1q follow=False checksum=9eacfeea91ec496576ff4a4caacc5e836fc91499 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:29.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:30 np0005539509 python3.9[198641]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:31.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:31 np0005539509 python3.9[198793]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:31.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:32 np0005539509 python3.9[198945]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:32 np0005539509 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 01:39:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:33.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:33 np0005539509 python3.9[199101]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:33 np0005539509 systemd[1]: Reloading.
Nov 29 01:39:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:33.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:33 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:33 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:33 np0005539509 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:39:34 np0005539509 systemd[1]: Starting Open-iSCSI...
Nov 29 01:39:34 np0005539509 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 01:39:34 np0005539509 systemd[1]: Started Open-iSCSI.
Nov 29 01:39:34 np0005539509 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 01:39:34 np0005539509 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 01:39:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:35.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:35 np0005539509 python3.9[199307]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:39:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:35.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:35 np0005539509 network[199325]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:39:35 np0005539509 network[199326]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:39:35 np0005539509 network[199327]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:39:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:37.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:37.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:39.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:39.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:41 np0005539509 python3.9[199599]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:39:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:41.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:41.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:43.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:43.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:45.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:45.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:45 np0005539509 python3.9[199751]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 01:39:45 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:39:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:45 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:39:47 np0005539509 python3.9[199908]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:47.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:47 np0005539509 podman[200003]: 2025-11-29 06:39:47.649975526 +0000 UTC m=+0.117865423 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller)
Nov 29 01:39:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:47.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:47 np0005539509 python3.9[200051]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398386.3816695-482-211260985094385/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:49.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:49.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:50 np0005539509 python3.9[200210]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:51.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:51.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:53.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:53.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:54 np0005539509 podman[200287]: 2025-11-29 06:39:54.327724221 +0000 UTC m=+0.066172369 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:39:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:39:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:55.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:39:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:55.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:57.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:57 np0005539509 python3.9[200381]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:57 np0005539509 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:39:57 np0005539509 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:39:57 np0005539509 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:39:57 np0005539509 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:39:57 np0005539509 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:39:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:57.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:58 np0005539509 python3.9[200537]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:59.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:39:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:59.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:00 np0005539509 python3.9[200689]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:01 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:40:01 np0005539509 python3.9[200841]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:01.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:02 np0005539509 python3.9[200993]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:03.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:03 np0005539509 python3.9[201116]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398401.512431-656-184756769076745/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:03.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:04 np0005539509 python3.9[201268]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:05 np0005539509 python3.9[201421]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:05.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:06 np0005539509 python3.9[201573]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:07 np0005539509 python3.9[201725]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:07.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:08 np0005539509 python3.9[201877]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:09.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:09 np0005539509 python3.9[202029]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:09.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:10 np0005539509 python3.9[202181]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:11.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:11 np0005539509 python3.9[202333]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:11.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:12 np0005539509 python3.9[202485]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:12 np0005539509 python3.9[202639]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:13.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:13.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:14 np0005539509 python3.9[202791]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:14 np0005539509 python3.9[202943]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:15.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:15 np0005539509 python3.9[203021]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:15.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:40:15.907 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:40:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:40:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:40:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:40:15.909 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:40:16 np0005539509 python3.9[203173]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:16 np0005539509 python3.9[203251]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:17 np0005539509 python3.9[203403]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:17.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:18 np0005539509 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 01:40:18 np0005539509 podman[203478]: 2025-11-29 06:40:18.149962434 +0000 UTC m=+0.089294362 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:40:18 np0005539509 python3.9[203583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:19 np0005539509 python3.9[203661]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:19.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:19 np0005539509 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:40:19 np0005539509 python3.9[203814]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:19.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:20 np0005539509 python3.9[203892]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:21.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:21 np0005539509 python3.9[204044]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:21 np0005539509 systemd[1]: Reloading.
Nov 29 01:40:21 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:21 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:21.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:22 np0005539509 python3.9[204365]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:23.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:23 np0005539509 python3.9[204443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 01:40:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 01:40:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:40:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:23.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:24 np0005539509 python3.9[204595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.343467) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424343498, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 734, "num_deletes": 252, "total_data_size": 1398800, "memory_usage": 1425240, "flush_reason": "Manual Compaction"}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424352568, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 924399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14525, "largest_seqno": 15254, "table_properties": {"data_size": 920862, "index_size": 1381, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 6914, "raw_average_key_size": 16, "raw_value_size": 913841, "raw_average_value_size": 2191, "num_data_blocks": 63, "num_entries": 417, "num_filter_entries": 417, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398369, "oldest_key_time": 1764398369, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 9149 microseconds, and 3043 cpu microseconds.
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.352616) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 924399 bytes OK
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.352635) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.353904) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.353917) EVENT_LOG_v1 {"time_micros": 1764398424353914, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.353932) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1394898, prev total WAL file size 1394898, number of live WAL files 2.
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.354612) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323533' seq:0, type:0; will stop at (end)
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(902KB)], [27(8955KB)]
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424354645, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 10094833, "oldest_snapshot_seqno": -1}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4513 keys, 9525855 bytes, temperature: kUnknown
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424420959, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 9525855, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9492970, "index_size": 20487, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11333, "raw_key_size": 112200, "raw_average_key_size": 24, "raw_value_size": 9408606, "raw_average_value_size": 2084, "num_data_blocks": 864, "num_entries": 4513, "num_filter_entries": 4513, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.421263) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 9525855 bytes
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.423057) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.8 rd, 143.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 8.7 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(21.2) write-amplify(10.3) OK, records in: 5030, records dropped: 517 output_compression: NoCompression
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.423076) EVENT_LOG_v1 {"time_micros": 1764398424423068, "job": 14, "event": "compaction_finished", "compaction_time_micros": 66491, "compaction_time_cpu_micros": 18021, "output_level": 6, "num_output_files": 1, "total_output_size": 9525855, "num_input_records": 5030, "num_output_records": 4513, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424423609, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424424997, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.354502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425152) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425159) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425162) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:40:24.425164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539509 podman[204646]: 2025-11-29 06:40:24.711274945 +0000 UTC m=+0.069641074 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:40:24 np0005539509 python3.9[204691]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:40:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:25.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:26 np0005539509 python3.9[204846]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:26 np0005539509 systemd[1]: Reloading.
Nov 29 01:40:26 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:26 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:26 np0005539509 systemd[1]: Starting Create netns directory...
Nov 29 01:40:26 np0005539509 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:40:26 np0005539509 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:40:26 np0005539509 systemd[1]: Finished Create netns directory.
Nov 29 01:40:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:27.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:27 np0005539509 python3.9[205039]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:40:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:27.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:40:28 np0005539509 python3.9[205191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:29.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:29 np0005539509 python3.9[205314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398428.0534494-1277-225374224493949/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:29.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:30 np0005539509 python3.9[205466]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:31 np0005539509 python3.9[205618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:31.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:32 np0005539509 python3.9[205741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398431.1533241-1352-221477454993418/.source.json _original_basename=.ey95y30w follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:33 np0005539509 python3.9[205893]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:33 np0005539509 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 01:40:33 np0005539509 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 01:40:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:33.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:35.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:35.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:37 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:37 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:37.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:37 np0005539509 python3.9[206372]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 01:40:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:37.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:38 np0005539509 python3.9[206524]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:40:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:39.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:39 np0005539509 python3.9[206676]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:40:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:39.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:41.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:41.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:42 np0005539509 python3[206858]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:40:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:43.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:43 np0005539509 podman[206872]: 2025-11-29 06:40:43.280639091 +0000 UTC m=+1.134216846 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:40:43 np0005539509 podman[206929]: 2025-11-29 06:40:43.464548999 +0000 UTC m=+0.065207751 container create 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:40:43 np0005539509 podman[206929]: 2025-11-29 06:40:43.438142566 +0000 UTC m=+0.038801338 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:40:43 np0005539509 python3[206858]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:40:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:43.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:45.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:45.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:46 np0005539509 python3.9[207120]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:47.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:47 np0005539509 python3.9[207274]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:47.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:47 np0005539509 python3.9[207350]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:48 np0005539509 podman[207403]: 2025-11-29 06:40:48.3801898 +0000 UTC m=+0.119409258 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:40:48 np0005539509 python3.9[207528]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398448.067343-1616-179385087069921/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:49.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:49 np0005539509 python3.9[207604]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:40:49 np0005539509 systemd[1]: Reloading.
Nov 29 01:40:49 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:49 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:49.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:50 np0005539509 python3.9[207717]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:50 np0005539509 systemd[1]: Reloading.
Nov 29 01:40:50 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:50 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:50 np0005539509 systemd[1]: Starting multipathd container...
Nov 29 01:40:51 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:40:51 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:51 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:51 np0005539509 systemd[1]: Started /usr/bin/podman healthcheck run 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.
Nov 29 01:40:51 np0005539509 podman[207758]: 2025-11-29 06:40:51.175233906 +0000 UTC m=+0.175082274 container init 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:40:51 np0005539509 multipathd[207773]: + sudo -E kolla_set_configs
Nov 29 01:40:51 np0005539509 podman[207758]: 2025-11-29 06:40:51.207461422 +0000 UTC m=+0.207309740 container start 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 01:40:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:51.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:51 np0005539509 podman[207758]: multipathd
Nov 29 01:40:51 np0005539509 systemd[1]: Started multipathd container.
Nov 29 01:40:51 np0005539509 multipathd[207773]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:40:51 np0005539509 multipathd[207773]: INFO:__main__:Validating config file
Nov 29 01:40:51 np0005539509 multipathd[207773]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:40:51 np0005539509 multipathd[207773]: INFO:__main__:Writing out command to execute
Nov 29 01:40:51 np0005539509 multipathd[207773]: ++ cat /run_command
Nov 29 01:40:51 np0005539509 multipathd[207773]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:40:51 np0005539509 multipathd[207773]: + ARGS=
Nov 29 01:40:51 np0005539509 multipathd[207773]: + sudo kolla_copy_cacerts
Nov 29 01:40:51 np0005539509 podman[207779]: 2025-11-29 06:40:51.307491431 +0000 UTC m=+0.078067420 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 29 01:40:51 np0005539509 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-50362cb9e60ec73a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:40:51 np0005539509 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-50362cb9e60ec73a.service: Failed with result 'exit-code'.
Nov 29 01:40:51 np0005539509 multipathd[207773]: + [[ ! -n '' ]]
Nov 29 01:40:51 np0005539509 multipathd[207773]: + . kolla_extend_start
Nov 29 01:40:51 np0005539509 multipathd[207773]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:40:51 np0005539509 multipathd[207773]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:40:51 np0005539509 multipathd[207773]: + umask 0022
Nov 29 01:40:51 np0005539509 multipathd[207773]: + exec /usr/sbin/multipathd -d
Nov 29 01:40:51 np0005539509 multipathd[207773]: 3915.815641 | --------start up--------
Nov 29 01:40:51 np0005539509 multipathd[207773]: 3915.815663 | read /etc/multipath.conf
Nov 29 01:40:51 np0005539509 multipathd[207773]: 3915.824181 | path checkers start up
Nov 29 01:40:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:51.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:52 np0005539509 python3.9[207963]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:53.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:53 np0005539509 python3.9[208117]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:53.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:54 np0005539509 python3.9[208282]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:40:54 np0005539509 systemd[1]: Stopping multipathd container...
Nov 29 01:40:55 np0005539509 podman[208284]: 2025-11-29 06:40:55.028055115 +0000 UTC m=+0.072452953 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:40:55 np0005539509 multipathd[207773]: 3919.524102 | exit (signal)
Nov 29 01:40:55 np0005539509 multipathd[207773]: 3919.524183 | --------shut down-------
Nov 29 01:40:55 np0005539509 systemd[1]: libpod-28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.scope: Deactivated successfully.
Nov 29 01:40:55 np0005539509 podman[208293]: 2025-11-29 06:40:55.084501083 +0000 UTC m=+0.085399933 container died 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 29 01:40:55 np0005539509 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-50362cb9e60ec73a.timer: Deactivated successfully.
Nov 29 01:40:55 np0005539509 systemd[1]: Stopped /usr/bin/podman healthcheck run 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.
Nov 29 01:40:55 np0005539509 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-userdata-shm.mount: Deactivated successfully.
Nov 29 01:40:55 np0005539509 systemd[1]: var-lib-containers-storage-overlay-03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af-merged.mount: Deactivated successfully.
Nov 29 01:40:55 np0005539509 podman[208293]: 2025-11-29 06:40:55.140663113 +0000 UTC m=+0.141561903 container cleanup 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 01:40:55 np0005539509 podman[208293]: multipathd
Nov 29 01:40:55 np0005539509 podman[208337]: multipathd
Nov 29 01:40:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:55.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:55 np0005539509 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 01:40:55 np0005539509 systemd[1]: Stopped multipathd container.
Nov 29 01:40:55 np0005539509 systemd[1]: Starting multipathd container...
Nov 29 01:40:55 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:40:55 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:55 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03113e0c60b3891b666198972292ad578a22a07280e6f874c53ef5a9916475af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:55 np0005539509 systemd[1]: Started /usr/bin/podman healthcheck run 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834.
Nov 29 01:40:55 np0005539509 podman[208349]: 2025-11-29 06:40:55.371728041 +0000 UTC m=+0.125596599 container init 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:40:55 np0005539509 multipathd[208365]: + sudo -E kolla_set_configs
Nov 29 01:40:55 np0005539509 podman[208349]: 2025-11-29 06:40:55.40190367 +0000 UTC m=+0.155772198 container start 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:40:55 np0005539509 podman[208349]: multipathd
Nov 29 01:40:55 np0005539509 systemd[1]: Started multipathd container.
Nov 29 01:40:55 np0005539509 multipathd[208365]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:40:55 np0005539509 multipathd[208365]: INFO:__main__:Validating config file
Nov 29 01:40:55 np0005539509 multipathd[208365]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:40:55 np0005539509 multipathd[208365]: INFO:__main__:Writing out command to execute
Nov 29 01:40:55 np0005539509 multipathd[208365]: ++ cat /run_command
Nov 29 01:40:55 np0005539509 multipathd[208365]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:40:55 np0005539509 multipathd[208365]: + ARGS=
Nov 29 01:40:55 np0005539509 multipathd[208365]: + sudo kolla_copy_cacerts
Nov 29 01:40:55 np0005539509 multipathd[208365]: + [[ ! -n '' ]]
Nov 29 01:40:55 np0005539509 multipathd[208365]: + . kolla_extend_start
Nov 29 01:40:55 np0005539509 multipathd[208365]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:40:55 np0005539509 multipathd[208365]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:40:55 np0005539509 multipathd[208365]: + umask 0022
Nov 29 01:40:55 np0005539509 multipathd[208365]: + exec /usr/sbin/multipathd -d
Nov 29 01:40:55 np0005539509 podman[208372]: 2025-11-29 06:40:55.484558005 +0000 UTC m=+0.068474233 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:40:55 np0005539509 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-3a5a6556aad8b51f.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:40:55 np0005539509 systemd[1]: 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834-3a5a6556aad8b51f.service: Failed with result 'exit-code'.
Nov 29 01:40:55 np0005539509 multipathd[208365]: 3919.974870 | --------start up--------
Nov 29 01:40:55 np0005539509 multipathd[208365]: 3919.974902 | read /etc/multipath.conf
Nov 29 01:40:55 np0005539509 multipathd[208365]: 3919.982838 | path checkers start up
Nov 29 01:40:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:55.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:57.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:57.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:58 np0005539509 python3.9[208554]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:59.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:40:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:59.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:00 np0005539509 python3.9[208706]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:41:00 np0005539509 python3.9[208858]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 01:41:00 np0005539509 kernel: Key type psk registered
Nov 29 01:41:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:01.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:01 np0005539509 python3.9[209019]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:01.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:02 np0005539509 python3.9[209142]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398461.1928601-1856-281339305287973/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:03.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:03 np0005539509 python3.9[209294]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:03.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:04 np0005539509 python3.9[209446]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:41:04 np0005539509 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:41:04 np0005539509 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:41:04 np0005539509 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:41:04 np0005539509 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:41:04 np0005539509 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:41:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:05.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:05 np0005539509 python3.9[209602]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:41:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:05.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:07.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:07 np0005539509 systemd[1]: Reloading.
Nov 29 01:41:07 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:07 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:07.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:08 np0005539509 systemd[1]: Reloading.
Nov 29 01:41:08 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:08 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:08 np0005539509 systemd-logind[785]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 01:41:08 np0005539509 systemd-logind[785]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 01:41:08 np0005539509 lvm[209715]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:41:08 np0005539509 lvm[209715]: VG ceph_vg0 finished
Nov 29 01:41:08 np0005539509 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:41:08 np0005539509 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:41:08 np0005539509 systemd[1]: Reloading.
Nov 29 01:41:08 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:08 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:09 np0005539509 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:41:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:09.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:09.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:10 np0005539509 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:41:10 np0005539509 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:41:10 np0005539509 systemd[1]: man-db-cache-update.service: Consumed 1.819s CPU time.
Nov 29 01:41:10 np0005539509 systemd[1]: run-re191bf4652a44b02bd4ce636969ccf72.service: Deactivated successfully.
Nov 29 01:41:10 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:11.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:11.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:13.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:13.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:13 np0005539509 python3.9[211056]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:41:14 np0005539509 systemd[1]: Stopping Open-iSCSI...
Nov 29 01:41:14 np0005539509 iscsid[199143]: iscsid shutting down.
Nov 29 01:41:14 np0005539509 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 01:41:14 np0005539509 systemd[1]: Stopped Open-iSCSI.
Nov 29 01:41:14 np0005539509 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:41:14 np0005539509 systemd[1]: Starting Open-iSCSI...
Nov 29 01:41:14 np0005539509 systemd[1]: Started Open-iSCSI.
Nov 29 01:41:15 np0005539509 python3.9[211211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:41:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:15.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:41:15.908 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:41:15.910 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:41:15.910 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:15.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:15 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:16 np0005539509 python3.9[211367]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:17.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:17 np0005539509 python3.9[211519]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:17 np0005539509 systemd[1]: Reloading.
Nov 29 01:41:17 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:17 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:17.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:18 np0005539509 podman[211678]: 2025-11-29 06:41:18.539139976 +0000 UTC m=+0.100078031 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 01:41:18 np0005539509 python3.9[211721]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:41:18 np0005539509 network[211748]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:41:18 np0005539509 network[211749]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:41:18 np0005539509 network[211750]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:41:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:19.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:19.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:21.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:21.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:23.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:23.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:23 np0005539509 python3.9[212025]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:24 np0005539509 python3.9[212178]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:25.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:25 np0005539509 podman[212300]: 2025-11-29 06:41:25.33350669 +0000 UTC m=+0.058173356 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:41:25 np0005539509 python3.9[212350]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:25 np0005539509 podman[212352]: 2025-11-29 06:41:25.741707739 +0000 UTC m=+0.089142287 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd)
Nov 29 01:41:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:26 np0005539509 python3.9[212524]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:27 np0005539509 python3.9[212677]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:27.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:27.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:28 np0005539509 python3.9[212830]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:28 np0005539509 python3.9[212983]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:29.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:29 np0005539509 python3.9[213136]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:41:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:41:30 np0005539509 python3.9[213289]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:30 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:31.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:31 np0005539509 python3.9[213441]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:31.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:32 np0005539509 python3.9[213593]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:32 np0005539509 python3.9[213745]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000055s ======
Nov 29 01:41:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:33.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Nov 29 01:41:33 np0005539509 python3.9[213897]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:33.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:34 np0005539509 python3.9[214049]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:34 np0005539509 python3.9[214201]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:35.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:35 np0005539509 python3.9[214353]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:35 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:37.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:37 np0005539509 python3.9[214636]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:37.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:38 np0005539509 python3.9[214788]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:38 np0005539509 python3.9[214940]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:39.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:41:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:41:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:41:39 np0005539509 python3.9[215092]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:40 np0005539509 python3.9[215244]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:40 np0005539509 python3.9[215396]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:41.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:41 np0005539509 python3.9[215548]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:41.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:42 np0005539509 python3.9[215700]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:43.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:43 np0005539509 python3.9[215852]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:44 np0005539509 python3.9[216004]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:41:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:45.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:45 np0005539509 python3.9[216156]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:45 np0005539509 systemd[1]: Reloading.
Nov 29 01:41:45 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:45 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:46 np0005539509 python3.9[216343]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:47.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:47 np0005539509 python3.9[216548]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:47.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:48 np0005539509 python3.9[216701]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:48 np0005539509 python3.9[216854]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:48 np0005539509 podman[216856]: 2025-11-29 06:41:48.911129669 +0000 UTC m=+0.158417921 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:41:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:49.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:49 np0005539509 python3.9[217033]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:49.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:50 np0005539509 python3.9[217186]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:50 np0005539509 python3.9[217339]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:51.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:51 np0005539509 python3.9[217492]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:51.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:53.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:53.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:54 np0005539509 python3.9[217645]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:55.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:55 np0005539509 python3.9[217797]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:55 np0005539509 podman[217798]: 2025-11-29 06:41:55.544719489 +0000 UTC m=+0.065554442 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:41:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:56 np0005539509 podman[217940]: 2025-11-29 06:41:56.005530888 +0000 UTC m=+0.079924411 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:41:56 np0005539509 python3.9[217988]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:56 np0005539509 python3.9[218140]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:57.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:57 np0005539509 python3.9[218292]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:57.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:58 np0005539509 python3.9[218444]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:58 np0005539509 python3.9[218596]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:59.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:59 np0005539509 python3.9[218748]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:41:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:59.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:00 np0005539509 python3.9[218900]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:00 np0005539509 python3.9[219052]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:01.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:01.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:03.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:03.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:05.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:05.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:05 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:07.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:08.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:10.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:13.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:14 np0005539509 python3.9[219204]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 01:42:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:15.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:15 np0005539509 python3.9[219357]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:42:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:42:15.910 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:42:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:42:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:16 np0005539509 python3.9[219515]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:42:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:17.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:18.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:18 np0005539509 systemd-logind[785]: New session 50 of user zuul.
Nov 29 01:42:18 np0005539509 systemd[1]: Started Session 50 of User zuul.
Nov 29 01:42:18 np0005539509 systemd[1]: session-50.scope: Deactivated successfully.
Nov 29 01:42:18 np0005539509 systemd-logind[785]: Session 50 logged out. Waiting for processes to exit.
Nov 29 01:42:18 np0005539509 systemd-logind[785]: Removed session 50.
Nov 29 01:42:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:19.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:19 np0005539509 podman[219645]: 2025-11-29 06:42:19.367500633 +0000 UTC m=+0.110348870 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:42:19 np0005539509 python3.9[219727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:20.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:20 np0005539509 python3.9[219849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.0266728-3439-270502854871527/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:20 np0005539509 python3.9[219999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:21 np0005539509 python3.9[220075]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:21.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:22.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:22 np0005539509 python3.9[220225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:22 np0005539509 python3.9[220346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398541.406909-3439-55184346923556/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:23.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:23 np0005539509 python3.9[220496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:24 np0005539509 python3.9[220617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.9289002-3439-204447158189704/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:24 np0005539509 python3.9[220767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:25 np0005539509 python3.9[220888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398544.182562-3439-205740641074297/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:25.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:25 np0005539509 podman[221012]: 2025-11-29 06:42:25.743266266 +0000 UTC m=+0.062418042 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:42:25 np0005539509 python3.9[221051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:26 np0005539509 podman[221150]: 2025-11-29 06:42:26.313490225 +0000 UTC m=+0.054339068 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 29 01:42:26 np0005539509 python3.9[221192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398545.4034517-3439-196084505826102/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:27.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:27 np0005539509 python3.9[221348]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:28.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:28 np0005539509 python3.9[221500]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:29 np0005539509 python3.9[221652]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:29.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:30 np0005539509 python3.9[221804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:30 np0005539509 python3.9[221927]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398549.4923234-3761-199268457231891/.source _original_basename=.5kp900w0 follow=False checksum=4f791796328ccd9f4aee7287aaf8210be2f93bd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 01:42:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:31.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:31 np0005539509 python3.9[222079]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:32.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:32 np0005539509 python3.9[222231]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:33 np0005539509 python3.9[222352]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398551.8813589-3838-180717272968298/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:33.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:33 np0005539509 python3.9[222502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:34 np0005539509 python3.9[222623]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.2543547-3883-41171828568607/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:35.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:35 np0005539509 python3.9[222775]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 01:42:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:36 np0005539509 python3.9[222927]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:37.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:37 np0005539509 python3[223079]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:38.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:39.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:40.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:41.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:42.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:43.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:44.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:45.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:46.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:47.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:48.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:49 np0005539509 podman[223096]: 2025-11-29 06:42:49.347249554 +0000 UTC m=+11.803139062 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:42:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:49.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:49 np0005539509 podman[223294]: 2025-11-29 06:42:49.504559895 +0000 UTC m=+0.049596065 container create 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:42:49 np0005539509 podman[223294]: 2025-11-29 06:42:49.473990189 +0000 UTC m=+0.019026379 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:42:49 np0005539509 python3[223079]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 01:42:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:50.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:50 np0005539509 podman[223369]: 2025-11-29 06:42:50.413388952 +0000 UTC m=+0.148976512 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:42:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:51.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:51 np0005539509 python3.9[223522]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:52.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:52 np0005539509 python3.9[223676]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 01:42:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:53.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:54 np0005539509 python3.9[223830]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:54.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:42:54 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:42:55 np0005539509 python3[223982]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:55.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:55 np0005539509 podman[224020]: 2025-11-29 06:42:55.43103818 +0000 UTC m=+0.027957346 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:42:55 np0005539509 podman[224020]: 2025-11-29 06:42:55.626125988 +0000 UTC m=+0.223045134 container create 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:42:55 np0005539509 python3[223982]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 01:42:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:56.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:56 np0005539509 podman[224083]: 2025-11-29 06:42:56.316160809 +0000 UTC m=+0.053372961 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 01:42:56 np0005539509 podman[224102]: 2025-11-29 06:42:56.421127629 +0000 UTC m=+0.072993015 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:42:56 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:42:57 np0005539509 python3.9[224251]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:57.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:58.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:58 np0005539509 python3.9[224405]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:58 np0005539509 python3.9[224558]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398578.319969-4159-27554632204550/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:42:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:59.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:59 np0005539509 python3.9[224634]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:42:59 np0005539509 systemd[1]: Reloading.
Nov 29 01:42:59 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:59 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:00.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:00 np0005539509 python3.9[224746]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:00 np0005539509 systemd[1]: Reloading.
Nov 29 01:43:00 np0005539509 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:00 np0005539509 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:00 np0005539509 systemd[1]: Starting nova_compute container...
Nov 29 01:43:01 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:43:01 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539509 podman[224787]: 2025-11-29 06:43:01.017983852 +0000 UTC m=+0.098440710 container init 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:43:01 np0005539509 podman[224787]: 2025-11-29 06:43:01.029137351 +0000 UTC m=+0.109594209 container start 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:43:01 np0005539509 podman[224787]: nova_compute
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + sudo -E kolla_set_configs
Nov 29 01:43:01 np0005539509 systemd[1]: Started nova_compute container.
Nov 29 01:43:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Validating config file
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying service configuration files
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Writing out command to execute
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:01 np0005539509 nova_compute[224801]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:01 np0005539509 nova_compute[224801]: ++ cat /run_command
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + CMD=nova-compute
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + ARGS=
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + sudo kolla_copy_cacerts
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + [[ ! -n '' ]]
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + . kolla_extend_start
Nov 29 01:43:01 np0005539509 nova_compute[224801]: Running command: 'nova-compute'
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + umask 0022
Nov 29 01:43:01 np0005539509 nova_compute[224801]: + exec nova-compute
Nov 29 01:43:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:01.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:43:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:02.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:43:02 np0005539509 python3.9[224963]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.273 224805 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.273 224805 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.274 224805 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.274 224805 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:43:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:03.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.452 224805 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.470 224805 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:03 np0005539509 nova_compute[224801]: 2025-11-29 06:43:03.471 224805 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:43:03 np0005539509 python3.9[225167]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:43:03 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:43:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:04.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.256 224805 INFO nova.virt.driver [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.426 224805 INFO nova.compute.provider_config [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.441 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.442 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.442 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.443 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.444 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.445 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.446 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.447 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.448 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.449 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.450 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.451 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.452 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.453 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.454 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.455 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.456 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.457 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.458 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.459 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.460 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.461 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.461 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.461 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.462 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.462 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.462 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.463 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.463 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.463 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.464 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.465 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.465 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.465 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.466 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.467 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.468 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.469 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.470 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.470 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.470 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.471 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.471 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.471 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.472 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.473 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.474 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.475 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.476 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.477 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.478 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.479 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.480 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.481 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.482 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.483 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.484 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.485 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.486 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.487 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.488 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.489 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.490 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.491 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.492 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.493 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.494 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.495 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.496 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.497 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.498 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.499 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.500 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.501 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.502 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.503 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.504 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.505 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.506 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.507 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.508 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.509 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.510 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.511 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.512 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.513 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.514 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.515 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.516 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.517 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.518 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.519 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.520 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.521 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.522 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.523 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.523 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.523 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.524 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.525 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.526 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.527 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.528 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.529 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.530 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.531 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.532 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.533 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.534 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.535 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.536 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.537 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.538 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.539 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.540 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.541 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.542 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.543 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.544 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.545 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.546 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.547 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.548 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.549 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.550 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.551 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.552 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.552 224805 WARNING oslo_config.cfg [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:43:04 np0005539509 nova_compute[224801]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:43:04 np0005539509 nova_compute[224801]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:43:04 np0005539509 nova_compute[224801]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:43:04 np0005539509 nova_compute[224801]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.552 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.553 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.554 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.555 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.556 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.557 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.558 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.559 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.560 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.561 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.562 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.563 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.564 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.565 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.566 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.567 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.568 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.569 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.570 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.571 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.572 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.573 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.574 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.575 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.576 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.577 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.578 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.579 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.580 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.581 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.582 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.583 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.584 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.585 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.586 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.587 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.588 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.589 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.590 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.591 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.592 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.593 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.594 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.595 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.596 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.597 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.598 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.599 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.600 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.601 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.602 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.603 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.604 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.605 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.606 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.607 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.608 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.609 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.610 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.611 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.612 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.613 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.614 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.615 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.616 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.617 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.618 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.619 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.620 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.621 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.622 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.623 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.624 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.625 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.626 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.627 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.628 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.629 224805 DEBUG oslo_service.service [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.630 224805 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.647 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.647 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.648 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.648 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 01:43:04 np0005539509 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:43:04 np0005539509 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:43:04 np0005539509 python3.9[225317]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.780 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd834e6c790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.783 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd834e6c790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.784 224805 INFO nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Connection event '1' reason 'None'
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.800 224805 WARNING nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 29 01:43:04 np0005539509 nova_compute[224801]: 2025-11-29 06:43:04.802 224805 DEBUG nova.virt.libvirt.volume.mount [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 01:43:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:05.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.707 224805 INFO nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <host>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <uuid>6289b14c-9d0e-4084-a899-2566f6eb59ac</uuid>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <arch>x86_64</arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <microcode version='16777317'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='x2apic'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='tsc-deadline'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='osxsave'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='hypervisor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='tsc_adjust'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='spec-ctrl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='stibp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='arch-capabilities'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='cmp_legacy'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='topoext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='virt-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='lbrv'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='tsc-scale'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='vmcb-clean'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='pause-filter'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='pfthreshold'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='rdctl-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='mds-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature name='pschange-mc-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <pages unit='KiB' size='4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <pages unit='KiB' size='2048'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <power_management>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <suspend_mem/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </power_management>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <iommu support='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <migration_features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <live/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <uri_transports>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <uri_transport>tcp</uri_transport>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <uri_transport>rdma</uri_transport>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </uri_transports>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </migration_features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <topology>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <cells num='1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <cell id='0'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          <memory unit='KiB'>7864316</memory>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          <pages unit='KiB' size='4'>1966079</pages>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          <distances>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <sibling id='0' value='10'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          </distances>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          <cpus num='8'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:          </cpus>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        </cell>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </cells>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </topology>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <cache>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </cache>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <secmodel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model>selinux</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <doi>0</doi>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </secmodel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <secmodel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model>dac</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <doi>0</doi>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </secmodel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </host>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <guest>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <os_type>hvm</os_type>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <arch name='i686'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <wordsize>32</wordsize>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <domain type='qemu'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <domain type='kvm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <pae/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <nonpae/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <apic default='on' toggle='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <cpuselection/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <deviceboot/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <externalSnapshot/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </guest>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <guest>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <os_type>hvm</os_type>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <arch name='x86_64'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <wordsize>64</wordsize>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <domain type='qemu'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <domain type='kvm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <apic default='on' toggle='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <cpuselection/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <deviceboot/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <externalSnapshot/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </guest>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 
Nov 29 01:43:05 np0005539509 nova_compute[224801]: </capabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.717 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.742 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:43:05 np0005539509 nova_compute[224801]: <domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <arch>i686</arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <vcpu max='240'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <os supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='firmware'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>rom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pflash</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>yes</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='secure'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </loader>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </os>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>file</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>memfd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </memoryBacking>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <devices>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>disk</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>floppy</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>lun</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ide</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>fdc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>sata</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </disk>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vnc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </graphics>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <video supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vga</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>none</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>bochs</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </video>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='mode'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>requisite</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>optional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pci</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </hostdev>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>random</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>egd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </rng>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>path</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>handle</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </filesystem>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>emulator</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>external</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>2.0</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </tpm>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </redirdev>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </channel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>qemu</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </crypto>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>passt</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </interface>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>isa</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </panic>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <console supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>null</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dev</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>file</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pipe</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>stdio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>udp</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tcp</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </console>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </devices>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='features'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vapic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>runtime</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>synic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>stimer</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>reset</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ipi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>avic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <defaults>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </defaults>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </hyperv>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tdx</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </launchSecurity>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: </domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
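The `domainCapabilities` XML above lists each CPU model with a `usable` attribute, and for models marked `usable='no'` a sibling `<blockers>` element naming the host-missing features. A minimal sketch of how such a dump can be summarized with the standard library (the fragment below is a hypothetical trimmed sample shaped like the log output above, not the full host capabilities):

```python
import xml.etree.ElementTree as ET

# Trimmed sample in the same shape as the <mode name='custom'> section of the
# domainCapabilities dump above (hypothetical data for illustration).
CAPS_XML = """
<mode name='custom' supported='yes'>
  <model usable='yes' vendor='Intel'>Westmere-v1</model>
  <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
  <blockers model='Skylake-Server-v1'>
    <feature name='avx512f'/>
    <feature name='pku'/>
  </blockers>
</mode>
"""

def summarize(xml_text):
    """Return (usable models, {blocked model: missing feature names})."""
    root = ET.fromstring(xml_text)
    usable = [m.text for m in root.findall('model') if m.get('usable') == 'yes']
    blocked = {
        b.get('model'): [f.get('name') for f in b.findall('feature')]
        for b in root.findall('blockers')
    }
    return usable, blocked

usable, blocked = summarize(CAPS_XML)
print(usable)   # models the host can run in 'custom' mode
print(blocked)  # models rejected, keyed to the features the host lacks
```

In a live deployment the same XML is obtained from libvirt (e.g. via python-libvirt's `conn.getDomainCapabilities(...)`, which is what nova's `_get_domain_capabilities` wraps); the parsing shown here applies unchanged to that output.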
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.751 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:43:05 np0005539509 nova_compute[224801]: <domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <arch>i686</arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <vcpu max='4096'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <os supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='firmware'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>rom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pflash</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>yes</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='secure'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </loader>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </os>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>file</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>memfd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </memoryBacking>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <devices>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>disk</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>floppy</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>lun</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>fdc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>sata</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </disk>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vnc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </graphics>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <video supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vga</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>none</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>bochs</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </video>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='mode'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>requisite</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>optional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pci</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </hostdev>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>random</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>egd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </rng>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>path</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>handle</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </filesystem>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>emulator</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>external</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>2.0</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </tpm>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </redirdev>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </channel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>qemu</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </crypto>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>passt</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </interface>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>isa</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </panic>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <console supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>null</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dev</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>file</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pipe</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>stdio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>udp</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tcp</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </console>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </devices>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='features'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vapic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>runtime</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>synic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>stimer</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>reset</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ipi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>avic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <defaults>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </defaults>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </hyperv>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tdx</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </launchSecurity>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: </domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.794 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.799 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:43:05 np0005539509 nova_compute[224801]: <domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <arch>x86_64</arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <vcpu max='240'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <os supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='firmware'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>rom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pflash</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>yes</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='secure'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </loader>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </os>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>file</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>memfd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </memoryBacking>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <devices>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>disk</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>floppy</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>lun</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ide</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>fdc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>sata</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </disk>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vnc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </graphics>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <video supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vga</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>none</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>bochs</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </video>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='mode'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>requisite</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>optional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pci</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </hostdev>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>random</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>egd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </rng>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>path</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>handle</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </filesystem>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>emulator</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>external</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>2.0</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </tpm>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </redirdev>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </channel>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>qemu</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </crypto>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>passt</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </interface>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>isa</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </panic>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <console supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>null</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vc</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dev</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>file</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pipe</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>stdio</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>udp</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tcp</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </console>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </devices>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='features'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vapic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>runtime</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>synic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>stimer</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>reset</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>ipi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>avic</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <defaults>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </defaults>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </hyperv>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>tdx</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </launchSecurity>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </features>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: </domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:05 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.861 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:43:05 np0005539509 nova_compute[224801]: <domainCapabilities>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <arch>x86_64</arch>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <vcpu max='4096'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <os supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <enum name='firmware'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>efi</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>rom</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>pflash</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>yes</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='secure'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>yes</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>no</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </loader>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  </os>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:  <cpu>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>on</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <value>off</value>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512er'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512pf'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fma4'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='tbm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xop'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-int8'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='amx-tile'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-bf16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-fp16'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bitalg'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512ifma'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrc'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fzrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='la57'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='taa-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xfd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='SierraForest'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-ifma'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cmpccxadd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fbsdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='fsrs'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ibrs-all'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='mcdt-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pbrsb-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='psdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='serialize'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vaes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='hle'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='rtm'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512bw'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512cd'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512dq'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512f'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='avx512vl'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='invpcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pcid'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='pku'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Snowridge'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='mpx'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='core-capability'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='split-lock-detect'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='cldemote'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='erms'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='gfni'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdir64b'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='movdiri'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='xsaves'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='athlon'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='athlon-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='core2duo'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='core2duo-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='coreduo'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='coreduo-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='n270'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='n270-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='ss'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='phenom'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <blockers model='phenom-v1'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnow'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <feature name='3dnowext'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </blockers>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </mode>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  </cpu>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  <memoryBacking supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <enum name='sourceType'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <value>file</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <value>anonymous</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <value>memfd</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  </memoryBacking>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  <devices>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <disk supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='diskDevice'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>disk</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>cdrom</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>floppy</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>lun</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>fdc</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>sata</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </disk>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <graphics supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>vnc</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>egl-headless</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </graphics>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <video supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='modelType'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>vga</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>cirrus</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>none</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>bochs</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>ramfb</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </video>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <hostdev supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='mode'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>subsystem</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='startupPolicy'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>mandatory</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>requisite</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>optional</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='subsysType'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>pci</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>scsi</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='capsType'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='pciBackend'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </hostdev>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <rng supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio-transitional</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtio-non-transitional</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>random</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>egd</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </rng>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <filesystem supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='driverType'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>path</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>handle</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>virtiofs</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </filesystem>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <tpm supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>tpm-tis</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>tpm-crb</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>emulator</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>external</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='backendVersion'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>2.0</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </tpm>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <redirdev supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='bus'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>usb</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </redirdev>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <channel supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </channel>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <crypto supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='model'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>qemu</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='backendModel'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>builtin</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </crypto>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <interface supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='backendType'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>default</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>passt</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </interface>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <panic supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='model'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>isa</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>hyperv</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </panic>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <console supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='type'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>null</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>vc</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>pty</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>dev</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>file</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>pipe</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>stdio</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>udp</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>tcp</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>unix</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>qemu-vdagent</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>dbus</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </console>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  </devices>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  <features>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <gic supported='no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <genid supported='yes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <backup supported='yes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <async-teardown supported='yes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <ps2 supported='yes'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <sev supported='no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <sgx supported='no'/>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <hyperv supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='features'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>relaxed</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>vapic</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>spinlocks</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>vpindex</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>runtime</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>synic</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>stimer</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>reset</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>vendor_id</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>frequencies</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>reenlightenment</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>tlbflush</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>ipi</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>avic</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>emsr_bitmap</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>xmm_input</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <defaults>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </defaults>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </hyperv>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    <launchSecurity supported='yes'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      <enum name='sectype'>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:        <value>tdx</value>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:      </enum>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:    </launchSecurity>
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  </features>
Nov 29 01:43:06 np0005539509 nova_compute[224801]: </domainCapabilities>
Nov 29 01:43:06 np0005539509 nova_compute[224801]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.930 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.930 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.931 224805 DEBUG nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.931 224805 INFO nova.virt.libvirt.host [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Secure Boot support detected
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.933 224805 INFO nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.933 224805 INFO nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.946 224805 DEBUG nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:43:06 np0005539509 nova_compute[224801]:  <model>Nehalem</model>
Nov 29 01:43:06 np0005539509 nova_compute[224801]: </cpu>
Nov 29 01:43:06 np0005539509 nova_compute[224801]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.950 224805 DEBUG nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:05.984 224805 INFO nova.virt.node [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Determined node identity 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from /var/lib/nova/compute_id
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.013 224805 WARNING nova.compute.manager [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Compute nodes ['774921e7-1fd5-4281-8c90-f7cd3ee5e01b'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 29 01:43:06 np0005539509 python3.9[225533]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.061 224805 INFO nova.compute.manager [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 01:43:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:06.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.189 224805 WARNING nova.compute.manager [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.189 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.190 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.190 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.190 224805 DEBUG nova.compute.resource_tracker [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.191 224805 DEBUG oslo_concurrency.processutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:43:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:43:06 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3797343968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.649 224805 DEBUG oslo_concurrency.processutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:43:06 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:43:06 np0005539509 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:43:06 np0005539509 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.983 224805 WARNING nova.virt.libvirt.driver [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.985 224805 DEBUG nova.compute.resource_tracker [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5341MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.985 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:43:06 np0005539509 nova_compute[224801]: 2025-11-29 06:43:06.986 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:43:07 np0005539509 nova_compute[224801]: 2025-11-29 06:43:07.039 224805 WARNING nova.compute.resource_tracker [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] No compute node record for compute-1.ctlplane.example.com:774921e7-1fd5-4281-8c90-f7cd3ee5e01b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 774921e7-1fd5-4281-8c90-f7cd3ee5e01b could not be found.
Nov 29 01:43:07 np0005539509 python3.9[225751]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:07 np0005539509 systemd[1]: Stopping nova_compute container...
Nov 29 01:43:07 np0005539509 nova_compute[224801]: 2025-11-29 06:43:07.244 224805 DEBUG oslo_concurrency.lockutils [None req-047885df-1b64-45be-8f37-cc1ecad6c49c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:43:07 np0005539509 nova_compute[224801]: 2025-11-29 06:43:07.245 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:43:07 np0005539509 nova_compute[224801]: 2025-11-29 06:43:07.245 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:43:07 np0005539509 nova_compute[224801]: 2025-11-29 06:43:07.245 224805 DEBUG oslo_concurrency.lockutils [None req-d47bae67-0a67-4a70-9f28-1a3fab1ca752 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:43:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:07.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:07 np0005539509 virtqemud[225339]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 01:43:07 np0005539509 virtqemud[225339]: hostname: compute-1
Nov 29 01:43:07 np0005539509 virtqemud[225339]: End of file while reading data: Input/output error
Nov 29 01:43:07 np0005539509 systemd[1]: libpod-31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955.scope: Deactivated successfully.
Nov 29 01:43:07 np0005539509 systemd[1]: libpod-31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955.scope: Consumed 4.073s CPU time.
Nov 29 01:43:07 np0005539509 podman[225757]: 2025-11-29 06:43:07.950661434 +0000 UTC m=+0.755315722 container died 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:43:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:08.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:08 np0005539509 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:08 np0005539509 systemd[1]: var-lib-containers-storage-overlay-545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48-merged.mount: Deactivated successfully.
Nov 29 01:43:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:09.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:10.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:11 np0005539509 podman[225757]: 2025-11-29 06:43:11.302740628 +0000 UTC m=+4.107394926 container cleanup 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 01:43:11 np0005539509 podman[225757]: nova_compute
Nov 29 01:43:11 np0005539509 podman[225787]: nova_compute
Nov 29 01:43:11 np0005539509 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 01:43:11 np0005539509 systemd[1]: Stopped nova_compute container.
Nov 29 01:43:11 np0005539509 systemd[1]: Starting nova_compute container...
Nov 29 01:43:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:11.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:11 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:43:11 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:11 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:11 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:11 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:11 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/545e640a0f5891e63c6c8d5b83775cef2a6d6a7e6708a84d8582271152386f48/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:11 np0005539509 podman[225800]: 2025-11-29 06:43:11.586750463 +0000 UTC m=+0.179945241 container init 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:43:11 np0005539509 podman[225800]: 2025-11-29 06:43:11.5992946 +0000 UTC m=+0.192489358 container start 31436e1d87b0e6426a33162924f7a3a27c57e2695288c415fc1a417e02a19955 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:43:11 np0005539509 podman[225800]: nova_compute
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + sudo -E kolla_set_configs
Nov 29 01:43:11 np0005539509 systemd[1]: Started nova_compute container.
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Validating config file
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying service configuration files
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Writing out command to execute
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:11 np0005539509 nova_compute[225815]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:11 np0005539509 nova_compute[225815]: ++ cat /run_command
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + CMD=nova-compute
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + ARGS=
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + sudo kolla_copy_cacerts
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + [[ ! -n '' ]]
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + . kolla_extend_start
Nov 29 01:43:11 np0005539509 nova_compute[225815]: Running command: 'nova-compute'
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + umask 0022
Nov 29 01:43:11 np0005539509 nova_compute[225815]: + exec nova-compute
Nov 29 01:43:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:12.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:12 np0005539509 python3.9[225978]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:43:12 np0005539509 systemd[1]: Started libpod-conmon-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d.scope.
Nov 29 01:43:12 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:43:12 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539509 podman[226005]: 2025-11-29 06:43:12.820117997 +0000 UTC m=+0.151107221 container init 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:43:12 np0005539509 podman[226005]: 2025-11-29 06:43:12.8314125 +0000 UTC m=+0.162401704 container start 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 29 01:43:12 np0005539509 python3.9[225978]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 01:43:12 np0005539509 nova_compute_init[226026]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 01:43:12 np0005539509 systemd[1]: libpod-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d.scope: Deactivated successfully.
Nov 29 01:43:12 np0005539509 podman[226041]: 2025-11-29 06:43:12.938148969 +0000 UTC m=+0.030547438 container died 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0)
Nov 29 01:43:12 np0005539509 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:12 np0005539509 systemd[1]: var-lib-containers-storage-overlay-dff9f09fc74a0a77d8626d02655dc1c388d815775966403bd82792086be1b196-merged.mount: Deactivated successfully.
Nov 29 01:43:12 np0005539509 podman[226041]: 2025-11-29 06:43:12.989555074 +0000 UTC m=+0.081953543 container cleanup 3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 29 01:43:12 np0005539509 systemd[1]: libpod-conmon-3d997eba43ccb4b7355390688a730e3f999a30df960db03d58e0027c6a107f6d.scope: Deactivated successfully.
Nov 29 01:43:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:13.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.782 225819 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.923 225819 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.956 225819 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:13 np0005539509 nova_compute[225815]: 2025-11-29 06:43:13.957 225819 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:43:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:14.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:15.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:43:15.911 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:43:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:43:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:16.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:16 np0005539509 systemd[1]: session-49.scope: Deactivated successfully.
Nov 29 01:43:16 np0005539509 systemd[1]: session-49.scope: Consumed 2min 31.856s CPU time.
Nov 29 01:43:16 np0005539509 systemd-logind[785]: Session 49 logged out. Waiting for processes to exit.
Nov 29 01:43:16 np0005539509 systemd-logind[785]: Removed session 49.
Nov 29 01:43:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:17.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:17 np0005539509 nova_compute[225815]: 2025-11-29 06:43:17.686 225819 INFO nova.virt.driver [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:43:17 np0005539509 nova_compute[225815]: 2025-11-29 06:43:17.812 225819 INFO nova.compute.provider_config [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:43:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:18.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.314 225819 DEBUG oslo_concurrency.lockutils [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.315 225819 DEBUG oslo_concurrency.lockutils [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.315 225819 DEBUG oslo_concurrency.lockutils [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.316 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.317 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.318 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.319 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.320 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.321 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.322 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.323 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.324 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.325 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.326 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.327 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.328 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.329 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.330 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.331 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.332 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.333 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.334 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.335 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.336 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.337 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.338 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.339 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.340 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.341 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.342 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.343 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.344 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.345 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.346 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.347 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.348 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.349 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.350 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.351 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.352 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.353 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.354 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.355 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.356 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.357 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.358 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.359 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.360 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.361 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.362 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.363 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.364 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.365 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.366 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.367 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.368 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.369 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.370 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.371 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.372 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.373 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.374 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.375 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.376 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.377 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.378 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.379 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.380 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.381 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.382 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.383 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.384 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.385 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.386 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.387 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.388 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.388 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.389 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.390 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.391 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.392 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.393 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 WARNING oslo_config.cfg [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:43:19 np0005539509 nova_compute[225815]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:43:19 np0005539509 nova_compute[225815]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:43:19 np0005539509 nova_compute[225815]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:43:19 np0005539509 nova_compute[225815]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.394 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.395 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.396 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.397 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.398 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.399 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.400 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.401 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.402 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.403 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.404 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.405 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.406 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.407 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.408 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.409 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.410 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.411 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.412 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.413 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.414 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.415 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.416 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.417 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.418 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.419 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.420 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.421 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.422 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.423 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.424 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.425 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.426 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.427 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.428 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.429 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.430 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.431 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.432 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.433 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.434 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.435 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:19.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.436 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.437 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.438 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.439 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.440 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.441 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.442 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.443 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.444 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.445 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.446 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.447 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.448 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.449 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.450 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.451 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.452 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.453 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.454 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.455 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.456 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.457 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.458 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.459 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.460 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.461 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.462 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.463 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.464 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.465 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.466 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.467 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.468 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.469 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.470 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.470 225819 DEBUG oslo_service.service [None req-597add21-e28d-4b64-9dad-462381afd2da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:43:19 np0005539509 nova_compute[225815]: 2025-11-29 06:43:19.471 225819 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:43:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:20.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.222 225819 INFO nova.virt.node [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Determined node identity 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from /var/lib/nova/compute_id#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.224 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.225 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.225 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.226 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.242 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd5043b5fa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.245 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd5043b5fa0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.246 225819 INFO nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.257 225819 INFO nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <host>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <uuid>6289b14c-9d0e-4084-a899-2566f6eb59ac</uuid>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <arch>x86_64</arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model>EPYC-Rome-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <vendor>AMD</vendor>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <microcode version='16777317'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='x2apic'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='tsc-deadline'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='osxsave'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='hypervisor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='tsc_adjust'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='spec-ctrl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='stibp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='arch-capabilities'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='cmp_legacy'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='topoext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='virt-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='lbrv'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='tsc-scale'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='vmcb-clean'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='pause-filter'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='pfthreshold'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='svme-addr-chk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='rdctl-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='mds-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature name='pschange-mc-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <pages unit='KiB' size='4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <pages unit='KiB' size='2048'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <power_management>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <suspend_mem/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </power_management>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <iommu support='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <migration_features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <live/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <uri_transports>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <uri_transport>tcp</uri_transport>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <uri_transport>rdma</uri_transport>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </uri_transports>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </migration_features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <topology>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <cells num='1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <cell id='0'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          <memory unit='KiB'>7864316</memory>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          <pages unit='KiB' size='4'>1966079</pages>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          <distances>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <sibling id='0' value='10'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          </distances>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          <cpus num='8'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:          </cpus>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        </cell>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </cells>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </topology>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <cache>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </cache>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <secmodel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model>selinux</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <doi>0</doi>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </secmodel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <secmodel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model>dac</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <doi>0</doi>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </secmodel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </host>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <guest>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <os_type>hvm</os_type>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <arch name='i686'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <wordsize>32</wordsize>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <domain type='qemu'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <domain type='kvm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <pae/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <nonpae/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <apic default='on' toggle='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <cpuselection/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <deviceboot/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <externalSnapshot/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </guest>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <guest>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <os_type>hvm</os_type>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <arch name='x86_64'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <wordsize>64</wordsize>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <domain type='qemu'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <domain type='kvm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <apic default='on' toggle='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <cpuselection/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <deviceboot/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <externalSnapshot/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </guest>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 
Nov 29 01:43:20 np0005539509 nova_compute[225815]: </capabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: #033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.264 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.269 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:43:20 np0005539509 nova_compute[225815]: <domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <domain>kvm</domain>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <arch>i686</arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <vcpu max='4096'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <iothreads supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <os supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='firmware'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <loader supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>rom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pflash</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='readonly'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>yes</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='secure'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </loader>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </os>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='maximumMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <vendor>AMD</vendor>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='succor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='custom' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-128'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-256'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-512'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <memoryBacking supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='sourceType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>anonymous</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>memfd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </memoryBacking>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <disk supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='diskDevice'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>disk</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cdrom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>floppy</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>lun</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>fdc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>sata</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </disk>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <graphics supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vnc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egl-headless</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </graphics>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <video supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='modelType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vga</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cirrus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>none</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>bochs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ramfb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </video>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hostdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='mode'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>subsystem</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='startupPolicy'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>mandatory</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>requisite</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>optional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='subsysType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pci</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='capsType'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='pciBackend'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hostdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <rng supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>random</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </rng>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <filesystem supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='driverType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>path</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>handle</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtiofs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </filesystem>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <tpm supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-tis</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-crb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emulator</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>external</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendVersion'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>2.0</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </tpm>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <redirdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </redirdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <channel supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </channel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <crypto supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </crypto>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <interface supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>passt</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </interface>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <panic supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>isa</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>hyperv</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </panic>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <console supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>null</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dev</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pipe</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stdio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>udp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tcp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu-vdagent</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </console>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <gic supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <genid supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backup supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <async-teardown supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <ps2 supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sev supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sgx supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hyperv supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='features'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>relaxed</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vapic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>spinlocks</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vpindex</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>runtime</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>synic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stimer</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reset</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vendor_id</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>frequencies</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reenlightenment</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tlbflush</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ipi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>avic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emsr_bitmap</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>xmm_input</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hyperv>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <launchSecurity supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='sectype'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tdx</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </launchSecurity>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: </domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.275 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:43:20 np0005539509 nova_compute[225815]: <domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <domain>kvm</domain>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <arch>i686</arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <vcpu max='240'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <iothreads supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <os supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='firmware'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <loader supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>rom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pflash</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='readonly'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>yes</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='secure'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </loader>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </os>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='maximumMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <vendor>AMD</vendor>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='succor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='custom' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-128'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-256'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-512'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <memoryBacking supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='sourceType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>anonymous</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>memfd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </memoryBacking>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <disk supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='diskDevice'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>disk</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cdrom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>floppy</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>lun</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ide</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>fdc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>sata</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </disk>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <graphics supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vnc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egl-headless</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </graphics>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <video supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='modelType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vga</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cirrus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>none</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>bochs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ramfb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </video>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hostdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='mode'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>subsystem</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='startupPolicy'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>mandatory</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>requisite</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>optional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='subsysType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pci</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='capsType'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='pciBackend'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hostdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <rng supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>random</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </rng>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <filesystem supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='driverType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>path</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>handle</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtiofs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </filesystem>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <tpm supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-tis</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-crb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emulator</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>external</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendVersion'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>2.0</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </tpm>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <redirdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </redirdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <channel supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </channel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <crypto supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </crypto>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <interface supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>passt</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </interface>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <panic supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>isa</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>hyperv</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </panic>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <console supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>null</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dev</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pipe</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stdio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>udp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tcp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu-vdagent</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </console>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <gic supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <genid supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backup supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <async-teardown supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <ps2 supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sev supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sgx supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hyperv supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='features'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>relaxed</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vapic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>spinlocks</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vpindex</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>runtime</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>synic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stimer</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reset</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vendor_id</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>frequencies</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reenlightenment</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tlbflush</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ipi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>avic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emsr_bitmap</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>xmm_input</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hyperv>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <launchSecurity supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='sectype'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tdx</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </launchSecurity>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: </domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.326 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.331 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:43:20 np0005539509 nova_compute[225815]: <domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <domain>kvm</domain>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <arch>x86_64</arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <vcpu max='4096'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <iothreads supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <os supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='firmware'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>efi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <loader supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>rom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pflash</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='readonly'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>yes</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='secure'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>yes</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </loader>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </os>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='maximumMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <vendor>AMD</vendor>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='succor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='custom' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-128'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-256'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-512'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <memoryBacking supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='sourceType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>anonymous</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>memfd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </memoryBacking>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <disk supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='diskDevice'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>disk</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cdrom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>floppy</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>lun</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>fdc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>sata</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </disk>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <graphics supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vnc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egl-headless</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </graphics>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <video supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='modelType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vga</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cirrus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>none</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>bochs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ramfb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </video>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hostdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='mode'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>subsystem</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='startupPolicy'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>mandatory</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>requisite</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>optional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='subsysType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pci</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='capsType'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='pciBackend'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hostdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <rng supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>random</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </rng>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <filesystem supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='driverType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>path</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>handle</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtiofs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </filesystem>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <tpm supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-tis</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-crb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emulator</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>external</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendVersion'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>2.0</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </tpm>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <redirdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </redirdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <channel supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </channel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <crypto supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </crypto>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <interface supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>passt</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </interface>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <panic supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>isa</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>hyperv</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </panic>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <console supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>null</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dev</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pipe</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stdio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>udp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tcp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu-vdagent</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </console>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <gic supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <genid supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backup supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <async-teardown supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <ps2 supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sev supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sgx supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hyperv supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='features'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>relaxed</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vapic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>spinlocks</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vpindex</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>runtime</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>synic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stimer</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reset</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vendor_id</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>frequencies</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reenlightenment</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tlbflush</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ipi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>avic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emsr_bitmap</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>xmm_input</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hyperv>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <launchSecurity supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='sectype'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tdx</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </launchSecurity>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: </domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.399 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:43:20 np0005539509 nova_compute[225815]: <domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <domain>kvm</domain>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <arch>x86_64</arch>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <vcpu max='240'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <iothreads supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <os supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='firmware'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <loader supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>rom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pflash</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='readonly'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>yes</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='secure'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>no</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </loader>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </os>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='maximumMigratable'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>on</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>off</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <vendor>AMD</vendor>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='succor'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <mode name='custom' supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Denverton-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='auto-ibrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amd-psfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='stibp-always-on'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='EPYC-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-128'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-256'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx10-512'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='prefetchiti'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Haswell-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512er'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512pf'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fma4'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tbm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xop'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='amx-tile'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-bf16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-fp16'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bitalg'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrc'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fzrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='la57'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='taa-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xfd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ifma'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cmpccxadd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fbsdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='fsrs'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ibrs-all'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mcdt-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pbrsb-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='psdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='serialize'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vaes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='hle'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='rtm'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512bw'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512cd'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512dq'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512f'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='avx512vl'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='invpcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pcid'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='pku'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='mpx'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='core-capability'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='split-lock-detect'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='cldemote'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='erms'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='gfni'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdir64b'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='movdiri'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='xsaves'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='athlon-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='core2duo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='coreduo-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='n270-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='ss'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <blockers model='phenom-v1'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnow'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <feature name='3dnowext'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </blockers>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </mode>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <memoryBacking supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <enum name='sourceType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>anonymous</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <value>memfd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </memoryBacking>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <disk supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='diskDevice'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>disk</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cdrom</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>floppy</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>lun</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ide</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>fdc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>sata</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </disk>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <graphics supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vnc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egl-headless</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </graphics>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <video supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='modelType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vga</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>cirrus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>none</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>bochs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ramfb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </video>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hostdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='mode'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>subsystem</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='startupPolicy'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>mandatory</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>requisite</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>optional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='subsysType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pci</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>scsi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='capsType'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='pciBackend'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hostdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <rng supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtio-non-transitional</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>random</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>egd</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </rng>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <filesystem supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='driverType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>path</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>handle</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>virtiofs</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </filesystem>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <tpm supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-tis</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tpm-crb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emulator</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>external</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendVersion'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>2.0</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </tpm>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <redirdev supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='bus'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>usb</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </redirdev>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <channel supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </channel>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <crypto supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendModel'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>builtin</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </crypto>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <interface supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='backendType'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>default</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>passt</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </interface>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <panic supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='model'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>isa</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>hyperv</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </panic>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <console supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='type'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>null</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vc</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pty</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dev</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>file</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>pipe</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stdio</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>udp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tcp</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>unix</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>qemu-vdagent</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>dbus</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </console>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </devices>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <gic supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <genid supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <backup supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <async-teardown supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <ps2 supported='yes'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sev supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <sgx supported='no'/>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <hyperv supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='features'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>relaxed</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vapic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>spinlocks</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vpindex</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>runtime</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>synic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>stimer</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reset</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>vendor_id</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>frequencies</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>reenlightenment</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tlbflush</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>ipi</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>avic</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>emsr_bitmap</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>xmm_input</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </defaults>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </hyperv>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    <launchSecurity supported='yes'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      <enum name='sectype'>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:        <value>tdx</value>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:      </enum>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:    </launchSecurity>
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  </features>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: </domainCapabilities>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.469 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.469 225819 INFO nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Secure Boot support detected#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.472 225819 INFO nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.496 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:43:20 np0005539509 nova_compute[225815]:  <model>Nehalem</model>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: </cpu>
Nov 29 01:43:20 np0005539509 nova_compute[225815]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.499 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 29 01:43:20 np0005539509 nova_compute[225815]: 2025-11-29 06:43:20.636 225819 DEBUG nova.virt.libvirt.volume.mount [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 29 01:43:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:21 np0005539509 podman[226114]: 2025-11-29 06:43:21.361479408 +0000 UTC m=+0.098659186 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 01:43:21 np0005539509 nova_compute[225815]: 2025-11-29 06:43:21.394 225819 INFO nova.virt.node [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Determined node identity 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from /var/lib/nova/compute_id#033[00m
Nov 29 01:43:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:21.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:22 np0005539509 nova_compute[225815]: 2025-11-29 06:43:22.028 225819 DEBUG nova.compute.manager [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Verified node 774921e7-1fd5-4281-8c90-f7cd3ee5e01b matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 29 01:43:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:22.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:23.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.070 225819 INFO nova.compute.manager [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 29 01:43:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:24.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.588 225819 ERROR nova.compute.manager [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Could not retrieve compute node resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-294ae9c0-9b11-49c9-9c76-3c1c56b62cdb"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-294ae9c0-9b11-49c9-9c76-3c1c56b62cdb"}]}#033[00m
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.987 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.988 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.988 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.988 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:43:24 np0005539509 nova_compute[225815]: 2025-11-29 06:43:24.989 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:25.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:25 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:43:25 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3359582458' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:43:25 np0005539509 nova_compute[225815]: 2025-11-29 06:43:25.470 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:25 np0005539509 nova_compute[225815]: 2025-11-29 06:43:25.620 225819 WARNING nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:43:25 np0005539509 nova_compute[225815]: 2025-11-29 06:43:25.621 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5302MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:43:25 np0005539509 nova_compute[225815]: 2025-11-29 06:43:25.621 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:25 np0005539509 nova_compute[225815]: 2025-11-29 06:43:25.621 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:26.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.229 225819 ERROR nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-20388899-591a-493f-912c-230384eb308c"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '774921e7-1fd5-4281-8c90-f7cd3ee5e01b' not found: No resource provider with uuid 774921e7-1fd5-4281-8c90-f7cd3ee5e01b found  ", "request_id": "req-20388899-591a-493f-912c-230384eb308c"}]}#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.229 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.230 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:43:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.412 225819 INFO nova.scheduler.client.report [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] [req-6d2ae49c-8926-4686-92df-ac92fd289c8a] Created resource provider record via placement API for resource provider with UUID 774921e7-1fd5-4281-8c90-f7cd3ee5e01b and name compute-1.ctlplane.example.com.#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.450 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:43:26 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/570139792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.896 225819 DEBUG oslo_concurrency.processutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.902 225819 DEBUG nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 01:43:26 np0005539509 nova_compute[225815]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.903 225819 INFO nova.virt.libvirt.host [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.904 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.905 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:43:26 np0005539509 nova_compute[225815]: 2025-11-29 06:43:26.910 225819 DEBUG nova.virt.libvirt.driver [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 01:43:26 np0005539509 nova_compute[225815]:  <arch>x86_64</arch>
Nov 29 01:43:26 np0005539509 nova_compute[225815]:  <model>Nehalem</model>
Nov 29 01:43:26 np0005539509 nova_compute[225815]:  <vendor>AMD</vendor>
Nov 29 01:43:26 np0005539509 nova_compute[225815]:  <topology sockets="8" cores="1" threads="1"/>
Nov 29 01:43:26 np0005539509 nova_compute[225815]: </cpu>
Nov 29 01:43:26 np0005539509 nova_compute[225815]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 29 01:43:27 np0005539509 podman[226186]: 2025-11-29 06:43:27.330010481 +0000 UTC m=+0.061837446 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 29 01:43:27 np0005539509 podman[226185]: 2025-11-29 06:43:27.363131978 +0000 UTC m=+0.092499135 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:43:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:27.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.469 225819 DEBUG nova.scheduler.client.report [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updated inventory for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.470 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.470 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.719 225819 DEBUG nova.compute.provider_tree [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Updating resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.855 225819 DEBUG nova.compute.resource_tracker [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.855 225819 DEBUG oslo_concurrency.lockutils [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:27 np0005539509 nova_compute[225815]: 2025-11-29 06:43:27.856 225819 DEBUG nova.service [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 29 01:43:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:28.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:28 np0005539509 nova_compute[225815]: 2025-11-29 06:43:28.636 225819 DEBUG nova.service [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 29 01:43:28 np0005539509 nova_compute[225815]: 2025-11-29 06:43:28.637 225819 DEBUG nova.servicegroup.drivers.db [None req-4f6496d9-cf62-4d7e-bdf5-5b6f856a2b28 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 29 01:43:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:29.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:30.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:31.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:43:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:32.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:43:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:33.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:34.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:35.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:37.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:38.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:39.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:40.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:41.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:43.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:43 np0005539509 nova_compute[225815]: 2025-11-29 06:43:43.640 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:43 np0005539509 nova_compute[225815]: 2025-11-29 06:43:43.819 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:44.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:45.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:46.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:47.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:48.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:49.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:50.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:51.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:52.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:52 np0005539509 podman[226221]: 2025-11-29 06:43:52.382515134 +0000 UTC m=+0.110752562 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:43:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:53.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:54.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:55.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:43:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:56.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:43:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:57.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:58.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:58 np0005539509 podman[226249]: 2025-11-29 06:43:58.331466562 +0000 UTC m=+0.065536327 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 29 01:43:58 np0005539509 podman[226248]: 2025-11-29 06:43:58.349342318 +0000 UTC m=+0.084410981 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:43:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:43:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:59.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:01.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:03.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:04.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:44:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:44:04 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:44:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:05.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:06.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:07.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:08.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:09.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:10.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:11.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:44:12 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:44:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:12.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:13.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:13 np0005539509 nova_compute[225815]: 2025-11-29 06:44:13.969 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:13 np0005539509 nova_compute[225815]: 2025-11-29 06:44:13.970 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:13 np0005539509 nova_compute[225815]: 2025-11-29 06:44:13.970 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:44:13 np0005539509 nova_compute[225815]: 2025-11-29 06:44:13.970 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:44:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:14.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:15.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:44:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:44:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:44:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:16.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.438 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.79 sec#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.458 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.458 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.459 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.460 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.460 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.461 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.461 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.462 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:44:17 np0005539509 nova_compute[225815]: 2025-11-29 06:44:17.462 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:18.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:20.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:22.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:23 np0005539509 podman[226468]: 2025-11-29 06:44:23.396177547 +0000 UTC m=+0.134508903 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:44:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:24.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:25.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:26.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:27.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:28.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:29 np0005539509 podman[226497]: 2025-11-29 06:44:29.34131914 +0000 UTC m=+0.072476628 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:44:29 np0005539509 podman[226496]: 2025-11-29 06:44:29.345979235 +0000 UTC m=+0.082721684 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:44:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:30.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:31.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:32.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:33.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:34.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:35.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.075 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.075 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.076 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.076 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.076 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.100 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.66 sec#033[00m
Nov 29 01:44:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:36.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:44:36 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2284150235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.547 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.761 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.763 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5347MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.764 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:36 np0005539509 nova_compute[225815]: 2025-11-29 06:44:36.765 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:37.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:38.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:44:39 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:44:39 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:44:39 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4273924820' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:44:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:39.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:40.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:40 np0005539509 nova_compute[225815]: 2025-11-29 06:44:40.465 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:44:40 np0005539509 nova_compute[225815]: 2025-11-29 06:44:40.466 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:44:40 np0005539509 nova_compute[225815]: 2025-11-29 06:44:40.518 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.937957) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680938056, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2761, "num_deletes": 509, "total_data_size": 6471047, "memory_usage": 6558432, "flush_reason": "Manual Compaction"}
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/945539473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:44:40 np0005539509 nova_compute[225815]: 2025-11-29 06:44:40.966 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680971582, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4239117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15259, "largest_seqno": 18015, "table_properties": {"data_size": 4228438, "index_size": 6469, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 23481, "raw_average_key_size": 18, "raw_value_size": 4205271, "raw_average_value_size": 3380, "num_data_blocks": 289, "num_entries": 1244, "num_filter_entries": 1244, "num_deletions": 509, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398425, "oldest_key_time": 1764398425, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 33688 microseconds, and 14984 cpu microseconds.
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.971644) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4239117 bytes OK
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.971663) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.974319) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.974346) EVENT_LOG_v1 {"time_micros": 1764398680974340, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.974365) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 6458171, prev total WAL file size 6458171, number of live WAL files 2.
Nov 29 01:44:40 np0005539509 nova_compute[225815]: 2025-11-29 06:44:40.975 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.976778) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323535' seq:0, type:0; will stop at (end)
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4139KB)], [30(9302KB)]
Nov 29 01:44:40 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680976866, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 13764972, "oldest_snapshot_seqno": -1}
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4723 keys, 11178332 bytes, temperature: kUnknown
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681050359, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11178332, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11142868, "index_size": 22554, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11845, "raw_key_size": 118497, "raw_average_key_size": 25, "raw_value_size": 11053537, "raw_average_value_size": 2340, "num_data_blocks": 935, "num_entries": 4723, "num_filter_entries": 4723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.050717) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11178332 bytes
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.052885) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.0 rd, 151.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 5757, records dropped: 1034 output_compression: NoCompression
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.052914) EVENT_LOG_v1 {"time_micros": 1764398681052903, "job": 16, "event": "compaction_finished", "compaction_time_micros": 73621, "compaction_time_cpu_micros": 27643, "output_level": 6, "num_output_files": 1, "total_output_size": 11178332, "num_input_records": 5757, "num_output_records": 4723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681053720, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681055825, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:40.976646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:44:41.055962) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539509 nova_compute[225815]: 2025-11-29 06:44:41.287 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:44:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:41.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:43.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:44.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:45.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:46.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:46 np0005539509 nova_compute[225815]: 2025-11-29 06:44:46.818 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:44:46 np0005539509 nova_compute[225815]: 2025-11-29 06:44:46.819 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 10.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:47.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:48.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:49.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:50.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:51.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:52.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:54.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:54 np0005539509 podman[226576]: 2025-11-29 06:44:54.374892613 +0000 UTC m=+0.116178468 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:44:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:55.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:56.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:57.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:44:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:59.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:00.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:00 np0005539509 podman[226602]: 2025-11-29 06:45:00.316645186 +0000 UTC m=+0.053582138 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:45:00 np0005539509 podman[226601]: 2025-11-29 06:45:00.342591346 +0000 UTC m=+0.082144419 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 01:45:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:01.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:02.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.882140) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702882266, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 458, "num_deletes": 251, "total_data_size": 609015, "memory_usage": 618624, "flush_reason": "Manual Compaction"}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702887441, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 401811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18020, "largest_seqno": 18473, "table_properties": {"data_size": 399311, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6105, "raw_average_key_size": 18, "raw_value_size": 394334, "raw_average_value_size": 1209, "num_data_blocks": 28, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398682, "oldest_key_time": 1764398682, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 5431 microseconds, and 1922 cpu microseconds.
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.887584) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 401811 bytes OK
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.887632) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889239) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889256) EVENT_LOG_v1 {"time_micros": 1764398702889252, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889273) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 606194, prev total WAL file size 606194, number of live WAL files 2.
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.890121) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(392KB)], [33(10MB)]
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702890232, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 11580143, "oldest_snapshot_seqno": -1}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4538 keys, 9460877 bytes, temperature: kUnknown
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702984972, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 9460877, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9428135, "index_size": 20280, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 115324, "raw_average_key_size": 25, "raw_value_size": 9343388, "raw_average_value_size": 2058, "num_data_blocks": 832, "num_entries": 4538, "num_filter_entries": 4538, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.985453) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 9460877 bytes
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.987501) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 122.1 rd, 99.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(52.4) write-amplify(23.5) OK, records in: 5049, records dropped: 511 output_compression: NoCompression
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.987517) EVENT_LOG_v1 {"time_micros": 1764398702987510, "job": 18, "event": "compaction_finished", "compaction_time_micros": 94872, "compaction_time_cpu_micros": 24867, "output_level": 6, "num_output_files": 1, "total_output_size": 9460877, "num_input_records": 5049, "num_output_records": 4538, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702987678, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702989534, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989644) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:45:02.989649) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:03.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:04.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:05.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:06.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:07.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:08.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:09.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:10.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:11.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:12.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:45:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:45:13 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:45:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:13.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:14.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:15.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:45:15.912 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:45:15.913 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:45:15.913 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:16.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:17.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:18.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:19.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:20.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:21.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:22.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:23.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:45:24 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:45:25 np0005539509 podman[226830]: 2025-11-29 06:45:25.37498302 +0000 UTC m=+0.106182948 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:45:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:25.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:26.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:27.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:28.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:29.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:31 np0005539509 podman[226855]: 2025-11-29 06:45:31.332874827 +0000 UTC m=+0.066170568 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:45:31 np0005539509 podman[226854]: 2025-11-29 06:45:31.332878757 +0000 UTC m=+0.071472201 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:45:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:31.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:32.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:34.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:37.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:38.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:39.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:40.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:41.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:42.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:43.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:44.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:45.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:46 np0005539509 nova_compute[225815]: 2025-11-29 06:45:46.811 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:46 np0005539509 nova_compute[225815]: 2025-11-29 06:45:46.812 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:47.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:48.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:49.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:50.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:51.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:52.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:52 np0005539509 nova_compute[225815]: 2025-11-29 06:45:52.887 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:52 np0005539509 nova_compute[225815]: 2025-11-29 06:45:52.887 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:45:52 np0005539509 nova_compute[225815]: 2025-11-29 06:45:52.888 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:45:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:53.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:54.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.563 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.563 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.563 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.564 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:45:55 np0005539509 nova_compute[225815]: 2025-11-29 06:45:55.565 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:55.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:56 np0005539509 podman[226892]: 2025-11-29 06:45:56.362584674 +0000 UTC m=+0.098918952 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:45:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:56.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:57.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:58.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:45:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:59.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:00.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:01.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:02 np0005539509 podman[226920]: 2025-11-29 06:46:02.334725556 +0000 UTC m=+0.070166115 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:46:02 np0005539509 podman[226919]: 2025-11-29 06:46:02.352536787 +0000 UTC m=+0.084915814 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:46:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:03.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:04.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:05.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:06.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:07.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:08.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.008 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.009 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.009 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.010 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.010 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:46:09 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:46:09 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3671393982' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.470 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.554 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.37 sec#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.687 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.688 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5358MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.689 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:09 np0005539509 nova_compute[225815]: 2025-11-29 06:46:09.689 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:09.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:10.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:10 np0005539509 nova_compute[225815]: 2025-11-29 06:46:10.600 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:46:10 np0005539509 nova_compute[225815]: 2025-11-29 06:46:10.601 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:46:10 np0005539509 nova_compute[225815]: 2025-11-29 06:46:10.647 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:46:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:46:11 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1964139985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:46:11 np0005539509 nova_compute[225815]: 2025-11-29 06:46:11.288 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.641s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:46:11 np0005539509 nova_compute[225815]: 2025-11-29 06:46:11.295 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:46:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:11.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:12.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:12 np0005539509 nova_compute[225815]: 2025-11-29 06:46:12.441 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:46:12 np0005539509 nova_compute[225815]: 2025-11-29 06:46:12.443 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:46:12 np0005539509 nova_compute[225815]: 2025-11-29 06:46:12.444 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:13.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:14.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:15.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:46:15.913 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:46:15.914 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:46:15.914 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:16.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:17.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:18.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:19.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:20.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:21.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:22.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:23.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:24.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:46:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:46:25 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:46:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:25.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:26.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:27 np0005539509 podman[227133]: 2025-11-29 06:46:27.370282977 +0000 UTC m=+0.104059475 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:46:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:27.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:28.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:29.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:30.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:31.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:32.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:33 np0005539509 podman[227185]: 2025-11-29 06:46:33.078809299 +0000 UTC m=+0.057922131 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 29 01:46:33 np0005539509 podman[227186]: 2025-11-29 06:46:33.086771951 +0000 UTC m=+0.061043733 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:46:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:33.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:46:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:46:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:34.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:35 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:46:35.706 139246 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:05:03', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'aa:d2:09:dd:a5:e1'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:46:35 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:46:35.708 139246 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:46:35 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:46:35.709 139246 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2fa83236-07b6-4ff7-bb56-9f4f13bed719, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:46:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:35.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:36.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:37.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:38.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:39.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:41.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:42.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:43.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:44.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:45.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:46.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:47.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:48.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:49.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:50.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:51.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:52.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:53.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:54.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:55.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:56.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:57.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:58 np0005539509 podman[227249]: 2025-11-29 06:46:58.363396504 +0000 UTC m=+0.089220657 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 01:46:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:58.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:46:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:46:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:59.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:47:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:00.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:01.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:02.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:03 np0005539509 podman[227275]: 2025-11-29 06:47:03.328336487 +0000 UTC m=+0.059671897 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:47:03 np0005539509 podman[227276]: 2025-11-29 06:47:03.354688172 +0000 UTC m=+0.079903408 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:47:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:03.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:04.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:05.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:06.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:07.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:08.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:09.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:11.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:12 np0005539509 nova_compute[225815]: 2025-11-29 06:47:12.446 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:12 np0005539509 nova_compute[225815]: 2025-11-29 06:47:12.446 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:12 np0005539509 nova_compute[225815]: 2025-11-29 06:47:12.447 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:47:12 np0005539509 nova_compute[225815]: 2025-11-29 06:47:12.447 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:47:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:12.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.712 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.712 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.714 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.714 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.714 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:47:13 np0005539509 nova_compute[225815]: 2025-11-29 06:47:13.715 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:13.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.056 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.057 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.057 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.058 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.058 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:14 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2186349468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:14.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.527 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.686 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.687 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5369MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.688 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:14 np0005539509 nova_compute[225815]: 2025-11-29 06:47:14.688 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:15.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:15 np0005539509 nova_compute[225815]: 2025-11-29 06:47:15.866 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:47:15 np0005539509 nova_compute[225815]: 2025-11-29 06:47:15.867 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:47:15 np0005539509 nova_compute[225815]: 2025-11-29 06:47:15.887 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:47:15.915 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:47:15.916 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:47:15.916 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:16 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1939421518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:16 np0005539509 nova_compute[225815]: 2025-11-29 06:47:16.405 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:16 np0005539509 nova_compute[225815]: 2025-11-29 06:47:16.414 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:16.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:16 np0005539509 nova_compute[225815]: 2025-11-29 06:47:16.546 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:16 np0005539509 nova_compute[225815]: 2025-11-29 06:47:16.548 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:47:16 np0005539509 nova_compute[225815]: 2025-11-29 06:47:16.548 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:17.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.064 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.065 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.234 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.235 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.235 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.251 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.251 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.251 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.252 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.379 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.380 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.380 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.380 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.381 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:47:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:18.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:18 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1602380716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:18 np0005539509 nova_compute[225815]: 2025-11-29 06:47:18.826 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.013 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.014 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5372MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.014 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.014 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.497 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.498 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.518 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:47:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:19.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:19 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:19 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2414150286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:19 np0005539509 nova_compute[225815]: 2025-11-29 06:47:19.994 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:47:20 np0005539509 nova_compute[225815]: 2025-11-29 06:47:20.001 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:47:20 np0005539509 nova_compute[225815]: 2025-11-29 06:47:20.473 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:47:20 np0005539509 nova_compute[225815]: 2025-11-29 06:47:20.474 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:47:20 np0005539509 nova_compute[225815]: 2025-11-29 06:47:20.474 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:47:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:20.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:21 np0005539509 nova_compute[225815]: 2025-11-29 06:47:21.189 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:47:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:21.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:22.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:23.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:24.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:25.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:26.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:27.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:28.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:29 np0005539509 podman[227407]: 2025-11-29 06:47:29.334369411 +0000 UTC m=+0.079348763 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:47:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:29.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:30.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:31.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:32.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:33 np0005539509 podman[227482]: 2025-11-29 06:47:33.491165855 +0000 UTC m=+0.077761282 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:47:33 np0005539509 podman[227483]: 2025-11-29 06:47:33.499454167 +0000 UTC m=+0.086790523 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 01:47:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:33.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:47:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:35 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:47:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:35.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:36.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:37.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:39.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:40.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:41.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:42.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:43 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:43.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:44.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:45.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:46.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:47.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:48.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:49.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:50.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:51.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:52.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:53.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:55.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.077007) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877077087, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1876, "num_deletes": 251, "total_data_size": 4555047, "memory_usage": 4605016, "flush_reason": "Manual Compaction"}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877091720, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1737351, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18479, "largest_seqno": 20349, "table_properties": {"data_size": 1731823, "index_size": 2668, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14351, "raw_average_key_size": 20, "raw_value_size": 1719513, "raw_average_value_size": 2428, "num_data_blocks": 123, "num_entries": 708, "num_filter_entries": 708, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398703, "oldest_key_time": 1764398703, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 14781 microseconds, and 6320 cpu microseconds.
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.091787) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1737351 bytes OK
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.091810) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.093739) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.093759) EVENT_LOG_v1 {"time_micros": 1764398877093754, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.093778) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4546656, prev total WAL file size 4546656, number of live WAL files 2.
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.095146) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373535' seq:0, type:0; will stop at (end)
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1696KB)], [36(9239KB)]
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877095208, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11198228, "oldest_snapshot_seqno": -1}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4812 keys, 8586310 bytes, temperature: kUnknown
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877163574, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 8586310, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8554107, "index_size": 19101, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12037, "raw_key_size": 121248, "raw_average_key_size": 25, "raw_value_size": 8466904, "raw_average_value_size": 1759, "num_data_blocks": 783, "num_entries": 4812, "num_filter_entries": 4812, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.163893) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 8586310 bytes
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.165342) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.5 rd, 125.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.0 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(11.4) write-amplify(4.9) OK, records in: 5246, records dropped: 434 output_compression: NoCompression
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.165359) EVENT_LOG_v1 {"time_micros": 1764398877165350, "job": 20, "event": "compaction_finished", "compaction_time_micros": 68480, "compaction_time_cpu_micros": 27997, "output_level": 6, "num_output_files": 1, "total_output_size": 8586310, "num_input_records": 5246, "num_output_records": 4812, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877165718, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877167342, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.095054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167443) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:47:57.167455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:57.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:47:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:59.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:00 np0005539509 podman[227656]: 2025-11-29 06:48:00.384264309 +0000 UTC m=+0.113395574 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:48:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:01.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:03.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:04 np0005539509 podman[227682]: 2025-11-29 06:48:04.322991971 +0000 UTC m=+0.066906101 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 01:48:04 np0005539509 podman[227683]: 2025-11-29 06:48:04.333691797 +0000 UTC m=+0.073913109 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:48:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:04.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:05.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:06 np0005539509 nova_compute[225815]: 2025-11-29 06:48:06.076 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 16.51 sec#033[00m
Nov 29 01:48:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:06.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:07.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:08.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:09.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:10.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:11.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:12.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:13.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:13 np0005539509 nova_compute[225815]: 2025-11-29 06:48:13.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:13 np0005539509 nova_compute[225815]: 2025-11-29 06:48:13.968 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:48:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:14.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:15.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:48:15.917 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:48:15.918 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:48:15.918 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:16.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:17.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:18 np0005539509 nova_compute[225815]: 2025-11-29 06:48:18.021 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 1.94 sec#033[00m
Nov 29 01:48:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:18.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:19.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:20.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:21.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:22.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:23.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:24.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:25.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:26.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:27.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:28.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:28 np0005539509 nova_compute[225815]: 2025-11-29 06:48:28.745 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 01:48:28 np0005539509 nova_compute[225815]: 2025-11-29 06:48:28.749 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:28 np0005539509 nova_compute[225815]: 2025-11-29 06:48:28.749 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 01:48:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:29.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:30.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:31 np0005539509 podman[227719]: 2025-11-29 06:48:31.369429556 +0000 UTC m=+0.103866609 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:48:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:31.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:32.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:33.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:34.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:35 np0005539509 podman[227747]: 2025-11-29 06:48:35.34526085 +0000 UTC m=+0.078117861 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:48:35 np0005539509 podman[227748]: 2025-11-29 06:48:35.345394064 +0000 UTC m=+0.078414039 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 01:48:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:48:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 6783 writes, 26K keys, 6783 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 6783 writes, 1386 syncs, 4.89 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 499 writes, 770 keys, 499 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s#012Interval WAL: 499 writes, 242 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 01:48:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:36.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:37.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:38.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:39.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:41 np0005539509 nova_compute[225815]: 2025-11-29 06:48:41.775 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 13.75 sec
Nov 29 01:48:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:41.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:42.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:43.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:44 np0005539509 podman[227961]: 2025-11-29 06:48:44.104339756 +0000 UTC m=+0.603869623 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default)
Nov 29 01:48:44 np0005539509 podman[227961]: 2025-11-29 06:48:44.202832961 +0000 UTC m=+0.702362728 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:48:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:44.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.078632) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925078713, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 711, "num_deletes": 251, "total_data_size": 1330634, "memory_usage": 1351040, "flush_reason": "Manual Compaction"}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925110707, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 878455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20354, "largest_seqno": 21060, "table_properties": {"data_size": 874937, "index_size": 1426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7933, "raw_average_key_size": 19, "raw_value_size": 867897, "raw_average_value_size": 2121, "num_data_blocks": 62, "num_entries": 409, "num_filter_entries": 409, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398878, "oldest_key_time": 1764398878, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 32130 microseconds, and 4359 cpu microseconds.
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.110772) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 878455 bytes OK
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.110806) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116419) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116438) EVENT_LOG_v1 {"time_micros": 1764398925116432, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116456) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1326846, prev total WAL file size 1326846, number of live WAL files 2.
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.117092) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(857KB)], [39(8385KB)]
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925117145, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 9464765, "oldest_snapshot_seqno": -1}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4703 keys, 7370713 bytes, temperature: kUnknown
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925158917, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 7370713, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7340287, "index_size": 17580, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 119554, "raw_average_key_size": 25, "raw_value_size": 7255961, "raw_average_value_size": 1542, "num_data_blocks": 714, "num_entries": 4703, "num_filter_entries": 4703, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.159430) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 7370713 bytes
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.161058) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.1 rd, 175.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 8.2 +0.0 blob) out(7.0 +0.0 blob), read-write-amplify(19.2) write-amplify(8.4) OK, records in: 5221, records dropped: 518 output_compression: NoCompression
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.161078) EVENT_LOG_v1 {"time_micros": 1764398925161068, "job": 22, "event": "compaction_finished", "compaction_time_micros": 42039, "compaction_time_cpu_micros": 17607, "output_level": 6, "num_output_files": 1, "total_output_size": 7370713, "num_input_records": 5221, "num_output_records": 4703, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925162017, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925164076, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.116966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164380) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164388) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:48:45.164391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:45.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539509 podman[228469]: 2025-11-29 06:48:46.561321511 +0000 UTC m=+0.028844703 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:48:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:46.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:46 np0005539509 podman[228469]: 2025-11-29 06:48:46.961674681 +0000 UTC m=+0.429197793 container create d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:48:47 np0005539509 systemd[1]: Started libpod-conmon-d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61.scope.
Nov 29 01:48:47 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:48:47 np0005539509 podman[228469]: 2025-11-29 06:48:47.312955177 +0000 UTC m=+0.780478369 container init d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:48:47 np0005539509 podman[228469]: 2025-11-29 06:48:47.326334175 +0000 UTC m=+0.793857317 container start d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True)
Nov 29 01:48:47 np0005539509 bold_cohen[228485]: 167 167
Nov 29 01:48:47 np0005539509 systemd[1]: libpod-d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61.scope: Deactivated successfully.
Nov 29 01:48:47 np0005539509 podman[228469]: 2025-11-29 06:48:47.343548195 +0000 UTC m=+0.811071377 container attach d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 01:48:47 np0005539509 podman[228469]: 2025-11-29 06:48:47.344182343 +0000 UTC m=+0.811705475 container died d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:48:47 np0005539509 systemd[1]: var-lib-containers-storage-overlay-8113f39255a3fa0d837a24c602ef032e8f0e2d9c3f4a185599c6ed8ab4a68107-merged.mount: Deactivated successfully.
Nov 29 01:48:47 np0005539509 podman[228469]: 2025-11-29 06:48:47.831567951 +0000 UTC m=+1.299091093 container remove d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=bold_cohen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Nov 29 01:48:47 np0005539509 systemd[1]: libpod-conmon-d0d07283abefcf8d89486d5363fc4245c946c044807c0c3395740521fab3af61.scope: Deactivated successfully.
Nov 29 01:48:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:47.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:48 np0005539509 podman[228509]: 2025-11-29 06:48:48.003004456 +0000 UTC m=+0.042176709 container create f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3)
Nov 29 01:48:48 np0005539509 systemd[1]: Started libpod-conmon-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope.
Nov 29 01:48:48 np0005539509 podman[228509]: 2025-11-29 06:48:47.984640615 +0000 UTC m=+0.023812698 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:48:48 np0005539509 systemd[1]: Started libcrun container.
Nov 29 01:48:48 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:48 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:48 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:48 np0005539509 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:48 np0005539509 podman[228509]: 2025-11-29 06:48:48.122977775 +0000 UTC m=+0.162149868 container init f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:48:48 np0005539509 podman[228509]: 2025-11-29 06:48:48.128907054 +0000 UTC m=+0.168079117 container start f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:48:48 np0005539509 podman[228509]: 2025-11-29 06:48:48.162956196 +0000 UTC m=+0.202128259 container attach f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 01:48:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:48.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:49 np0005539509 strange_cannon[228525]: [
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:    {
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "available": false,
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "ceph_device": false,
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "lsm_data": {},
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "lvs": [],
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "path": "/dev/sr0",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "rejected_reasons": [
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "Has a FileSystem",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "Insufficient space (<5GB)"
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        ],
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        "sys_api": {
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "actuators": null,
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "device_nodes": "sr0",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "devname": "sr0",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "human_readable_size": "482.00 KB",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "id_bus": "ata",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "model": "QEMU DVD-ROM",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "nr_requests": "2",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "parent": "/dev/sr0",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "partitions": {},
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "path": "/dev/sr0",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "removable": "1",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "rev": "2.5+",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "ro": "0",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "rotational": "1",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "sas_address": "",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "sas_device_handle": "",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "scheduler_mode": "mq-deadline",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "sectors": 0,
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "sectorsize": "2048",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "size": 493568.0,
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "support_discard": "2048",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "type": "disk",
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:            "vendor": "QEMU"
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:        }
Nov 29 01:48:49 np0005539509 strange_cannon[228525]:    }
Nov 29 01:48:49 np0005539509 strange_cannon[228525]: ]
Nov 29 01:48:49 np0005539509 systemd[1]: libpod-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope: Deactivated successfully.
Nov 29 01:48:49 np0005539509 systemd[1]: libpod-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope: Consumed 1.272s CPU time.
Nov 29 01:48:49 np0005539509 podman[228509]: 2025-11-29 06:48:49.375792839 +0000 UTC m=+1.414964932 container died f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:48:49 np0005539509 systemd[1]: var-lib-containers-storage-overlay-18aa92f6b0e2733f1717726aed8428950bd2876963bc14bef06892dfbce88a20-merged.mount: Deactivated successfully.
Nov 29 01:48:49 np0005539509 podman[228509]: 2025-11-29 06:48:49.537439492 +0000 UTC m=+1.576611555 container remove f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=strange_cannon, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:48:49 np0005539509 systemd[1]: libpod-conmon-f675f6b32c361d29daed2a8d177c5a0eb7928033925a415e0cf6fa5f65d7c096.scope: Deactivated successfully.
Nov 29 01:48:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:49.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:50 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:50 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:50 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:48:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:50.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:51 np0005539509 nova_compute[225815]: 2025-11-29 06:48:51.093 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:51 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:48:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:51.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:52.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:53.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:54.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:55 np0005539509 nova_compute[225815]: 2025-11-29 06:48:55.883 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 4.11 sec#033[00m
Nov 29 01:48:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:55.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:56.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:57.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:59 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:59 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:48:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:59.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:00.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:02 np0005539509 podman[229790]: 2025-11-29 06:49:02.424034041 +0000 UTC m=+0.156542589 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 01:49:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:49:02 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:49:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:49:02 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4223701543' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:49:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:02.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:02 np0005539509 nova_compute[225815]: 2025-11-29 06:49:02.831 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:02 np0005539509 nova_compute[225815]: 2025-11-29 06:49:02.831 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:02 np0005539509 nova_compute[225815]: 2025-11-29 06:49:02.832 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:49:02 np0005539509 nova_compute[225815]: 2025-11-29 06:49:02.832 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:49:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:03.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:04.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:06 np0005539509 podman[229818]: 2025-11-29 06:49:06.322038134 +0000 UTC m=+0.066257423 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 01:49:06 np0005539509 podman[229817]: 2025-11-29 06:49:06.341143945 +0000 UTC m=+0.079954239 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:49:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:06.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:07.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:08.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:09.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:10.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:12.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:12.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:14.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:14.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:49:15.918 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:49:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:49:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:16.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:16.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:18.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:18.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:49:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:20.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:49:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:20.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:49:21 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 3866 writes, 21K keys, 3866 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 3866 writes, 3866 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1409 writes, 7247 keys, 1409 commit groups, 1.0 writes per commit group, ingest: 14.93 MB, 0.02 MB/s#012Interval WAL: 1409 writes, 1409 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     50.1      0.52              0.09        11    0.048       0      0       0.0       0.0#012  L6      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    108.3     89.4      1.02              0.28        10    0.102     49K   5239       0.0       0.0#012 Sum      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     71.4     76.0      1.54              0.38        21    0.073     49K   5239       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.7    111.6    106.3      0.58              0.19        12    0.049     31K   3467       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    108.3     89.4      1.02              0.28        10    0.102     49K   5239       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     63.0      0.42              0.09        10    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.11              0.00         1    0.108       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.1 total, 600.0 interval#012Flush(GB): cumulative 0.026, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.11 GB write, 0.06 MB/s write, 0.11 GB read, 0.06 MB/s read, 1.5 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562155f711f0#2 capacity: 304.00 MB usage: 8.06 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(419,7.66 MB,2.51979%) FilterBlock(21,140.86 KB,0.0452493%) IndexBlock(21,272.67 KB,0.0875925%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 29 01:49:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:22.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:22.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:24.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:24 np0005539509 nova_compute[225815]: 2025-11-29 06:49:24.150 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 18.27 sec#033[00m
Nov 29 01:49:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:26.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:28.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:28.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:32.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:32.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:33 np0005539509 podman[229856]: 2025-11-29 06:49:33.375084259 +0000 UTC m=+0.102277167 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:49:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.164 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.165 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:49:34 np0005539509 nova_compute[225815]: 2025-11-29 06:49:34.166 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:36.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:37 np0005539509 podman[229884]: 2025-11-29 06:49:37.313772809 +0000 UTC m=+0.051551370 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:49:37 np0005539509 nova_compute[225815]: 2025-11-29 06:49:37.336 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.18 sec#033[00m
Nov 29 01:49:37 np0005539509 podman[229883]: 2025-11-29 06:49:37.341197282 +0000 UTC m=+0.083537495 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:49:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:38.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:40.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:42.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:44.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:46.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:46.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:48.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:48.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:50.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:50.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.117 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.117 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.117 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.118 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.118 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:49:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:49:51 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1398036893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.573 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.766 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.767 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5359MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.767 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:49:51 np0005539509 nova_compute[225815]: 2025-11-29 06:49:51.768 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:49:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:52.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:52.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:54.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:55 np0005539509 ceph-mgr[81116]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:49:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:56.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:58.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:49:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:00.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:00 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539509 ceph-mon[80754]: overall HEALTH_OK
Nov 29 01:50:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:00.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:01 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:02.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:02.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:04.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:04 np0005539509 podman[230197]: 2025-11-29 06:50:04.341609558 +0000 UTC m=+0.086811703 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:04.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:06.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:06 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:50:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:07 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:50:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:08.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:08 np0005539509 podman[230224]: 2025-11-29 06:50:08.307752732 +0000 UTC m=+0.048351574 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:50:08 np0005539509 podman[230223]: 2025-11-29 06:50:08.345959314 +0000 UTC m=+0.089465794 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:50:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:08.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:10.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:11 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:12.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:12.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:14.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:14.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:50:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:50:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:50:15.919 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:16.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:16.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:18.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:18.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:19 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:20.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:20 np0005539509 nova_compute[225815]: 2025-11-29 06:50:20.615 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 33.28 sec
Nov 29 01:50:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:20.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:21 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:22.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:24.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:24.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:26.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:28.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:28.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:30.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:30.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:32.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:34.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:35 np0005539509 podman[230312]: 2025-11-29 06:50:35.370057203 +0000 UTC m=+0.102324638 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:50:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:36.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:36.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:38.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:38.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:39 np0005539509 podman[230338]: 2025-11-29 06:50:39.31981361 +0000 UTC m=+0.064070315 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:39 np0005539509 podman[230339]: 2025-11-29 06:50:39.331144083 +0000 UTC m=+0.067032854 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.592 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.594 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.747 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing inventories for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.773 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating ProviderTree inventory for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.773 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.790 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing aggregate associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.822 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing trait associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 01:50:39 np0005539509 nova_compute[225815]: 2025-11-29 06:50:39.921 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:40.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:40 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:50:40 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4231777605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:50:40 np0005539509 nova_compute[225815]: 2025-11-29 06:50:40.363 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:50:40 np0005539509 nova_compute[225815]: 2025-11-29 06:50:40.370 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:50:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:40.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:41 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:42.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:42.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:44.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:44.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:46.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:46.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:48.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:48.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:50.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:50.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:51 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:52.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:53 np0005539509 nova_compute[225815]: 2025-11-29 06:50:53.950 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 13.33 sec
Nov 29 01:50:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:54.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:54.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:56.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:56.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:56 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:58.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:50:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:02 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 29 01:51:02 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 29 01:51:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:02 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Nov 29 01:51:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:02.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:03 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 29 01:51:03 np0005539509 radosgw[83442]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 29 01:51:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:04.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:04.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:06.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:06 np0005539509 podman[230403]: 2025-11-29 06:51:06.362251755 +0000 UTC m=+0.097543096 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:51:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:06.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:08.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:08.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:10.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:10 np0005539509 nova_compute[225815]: 2025-11-29 06:51:10.202 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:10 np0005539509 nova_compute[225815]: 2025-11-29 06:51:10.204 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:51:10 np0005539509 nova_compute[225815]: 2025-11-29 06:51:10.204 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 78.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:10 np0005539509 podman[230431]: 2025-11-29 06:51:10.324566774 +0000 UTC m=+0.057566199 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:51:10 np0005539509 podman[230430]: 2025-11-29 06:51:10.348773355 +0000 UTC m=+0.086452056 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 01:51:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:12.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:14.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:14.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:15 np0005539509 nova_compute[225815]: 2025-11-29 06:51:15.572 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 11.62 sec#033[00m
Nov 29 01:51:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:51:15.922 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:51:15.922 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:51:15.922 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:16.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:16.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:18.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:20.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:20 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:20 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:20 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 01:51:20 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 01:51:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:22.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:22.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:51:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:51:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:24.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:25 np0005539509 nova_compute[225815]: 2025-11-29 06:51:25.334 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:25 np0005539509 nova_compute[225815]: 2025-11-29 06:51:25.335 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.003000079s ======
Nov 29 01:51:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:26.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Nov 29 01:51:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:28.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:28.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:29 np0005539509 nova_compute[225815]: 2025-11-29 06:51:29.399 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:29 np0005539509 nova_compute[225815]: 2025-11-29 06:51:29.400 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:51:29 np0005539509 nova_compute[225815]: 2025-11-29 06:51:29.400 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.196 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 4.62 sec#033[00m
Nov 29 01:51:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:30.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.498 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.499 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.500 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.500 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:51:30 np0005539509 nova_compute[225815]: 2025-11-29 06:51:30.500 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:30.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.082 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.082 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.083 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.083 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.083 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:31 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:51:31 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/990427680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.514 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.688 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.690 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5364MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.690 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:31 np0005539509 nova_compute[225815]: 2025-11-29 06:51:31.690 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:32.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:32 np0005539509 nova_compute[225815]: 2025-11-29 06:51:32.922 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:51:32 np0005539509 nova_compute[225815]: 2025-11-29 06:51:32.922 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:51:32 np0005539509 nova_compute[225815]: 2025-11-29 06:51:32.935 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:32.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:33 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:51:33 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3831186218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:51:33 np0005539509 nova_compute[225815]: 2025-11-29 06:51:33.412 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:33 np0005539509 nova_compute[225815]: 2025-11-29 06:51:33.417 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:34.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:34.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:36.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:36 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:36.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:37 np0005539509 podman[230698]: 2025-11-29 06:51:37.360045791 +0000 UTC m=+0.105349485 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:51:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:38.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:38.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:40.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:40 np0005539509 nova_compute[225815]: 2025-11-29 06:51:40.518 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:40 np0005539509 nova_compute[225815]: 2025-11-29 06:51:40.521 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:51:40 np0005539509 nova_compute[225815]: 2025-11-29 06:51:40.521 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 8.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:40.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:41 np0005539509 podman[230724]: 2025-11-29 06:51:41.311061597 +0000 UTC m=+0.055989328 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 29 01:51:41 np0005539509 podman[230725]: 2025-11-29 06:51:41.311080857 +0000 UTC m=+0.050999852 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:51:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:42.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:42.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:44.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:46.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:48.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:48.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:50.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:52.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:52.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:54.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:54.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:56 np0005539509 nova_compute[225815]: 2025-11-29 06:51:56.038 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 5.84 sec#033[00m
Nov 29 01:51:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:56.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:56.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:58.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:51:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:58.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:00.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:01.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:02.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:03.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:04.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:05.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:06.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:07.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:08.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:08 np0005539509 podman[230764]: 2025-11-29 06:52:08.339795988 +0000 UTC m=+0.082214332 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:52:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:10.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:11.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:12 np0005539509 podman[230792]: 2025-11-29 06:52:12.303546905 +0000 UTC m=+0.048667121 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 01:52:12 np0005539509 podman[230791]: 2025-11-29 06:52:12.303588206 +0000 UTC m=+0.050524040 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 01:52:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:12.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:13.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:14.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:15.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:52:15.923 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:52:15.924 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:52:15.924 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:16.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:17.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:18.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:19.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:20.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:21.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:23.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:25.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:26.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:27.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.502251) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147502348, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 251, "total_data_size": 6203043, "memory_usage": 6281200, "flush_reason": "Manual Compaction"}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147525251, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 4024087, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21065, "largest_seqno": 23418, "table_properties": {"data_size": 4014455, "index_size": 6126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19000, "raw_average_key_size": 20, "raw_value_size": 3995455, "raw_average_value_size": 4214, "num_data_blocks": 274, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398925, "oldest_key_time": 1764398925, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 23148 microseconds, and 8736 cpu microseconds.
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.525393) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 4024087 bytes OK
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.525422) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527798) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527816) EVENT_LOG_v1 {"time_micros": 1764399147527811, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527841) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 6192706, prev total WAL file size 6192706, number of live WAL files 2.
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.529692) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(3929KB)], [42(7197KB)]
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147530191, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 11394800, "oldest_snapshot_seqno": -1}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5132 keys, 9349278 bytes, temperature: kUnknown
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147597182, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9349278, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9314499, "index_size": 20845, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 128876, "raw_average_key_size": 25, "raw_value_size": 9221031, "raw_average_value_size": 1796, "num_data_blocks": 857, "num_entries": 5132, "num_filter_entries": 5132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.597599) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9349278 bytes
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.599327) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.8 rd, 139.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 7.0 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.2) write-amplify(2.3) OK, records in: 5651, records dropped: 519 output_compression: NoCompression
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.599345) EVENT_LOG_v1 {"time_micros": 1764399147599335, "job": 24, "event": "compaction_finished", "compaction_time_micros": 67122, "compaction_time_cpu_micros": 24779, "output_level": 6, "num_output_files": 1, "total_output_size": 9349278, "num_input_records": 5651, "num_output_records": 5132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147600522, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147602091, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.529531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:27.602137) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:28.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:29.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.167133) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149167215, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 262, "num_deletes": 256, "total_data_size": 20532, "memory_usage": 27576, "flush_reason": "Manual Compaction"}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149170060, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 13142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23420, "largest_seqno": 23680, "table_properties": {"data_size": 11326, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4242, "raw_average_key_size": 16, "raw_value_size": 7889, "raw_average_value_size": 30, "num_data_blocks": 2, "num_entries": 261, "num_filter_entries": 261, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 2954 microseconds, and 1035 cpu microseconds.
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170103) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 13142 bytes OK
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170124) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171581) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171605) EVENT_LOG_v1 {"time_micros": 1764399149171600, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171626) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 18466, prev total WAL file size 18466, number of live WAL files 2.
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.172389) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323534' seq:72057594037927935, type:22 .. '6C6F676D00353036' seq:0, type:0; will stop at (end)
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(12KB)], [45(9130KB)]
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149172429, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9362420, "oldest_snapshot_seqno": -1}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4877 keys, 9228718 bytes, temperature: kUnknown
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149242967, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 9228718, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9195140, "index_size": 20284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 124773, "raw_average_key_size": 25, "raw_value_size": 9105598, "raw_average_value_size": 1867, "num_data_blocks": 828, "num_entries": 4877, "num_filter_entries": 4877, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.243218) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 9228718 bytes
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.245018) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.6 rd, 130.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 8.9 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(1414.6) write-amplify(702.2) OK, records in: 5393, records dropped: 516 output_compression: NoCompression
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.245037) EVENT_LOG_v1 {"time_micros": 1764399149245028, "job": 26, "event": "compaction_finished", "compaction_time_micros": 70618, "compaction_time_cpu_micros": 22970, "output_level": 6, "num_output_files": 1, "total_output_size": 9228718, "num_input_records": 5393, "num_output_records": 4877, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149245151, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149246621, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.171920) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:52:29.246693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:30.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:31.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:32.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:33.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:34.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:35.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:36.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:37.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:38.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:39.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:39 np0005539509 podman[230959]: 2025-11-29 06:52:39.346172084 +0000 UTC m=+0.089148469 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:52:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:52:39 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:52:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:40.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:40 np0005539509 nova_compute[225815]: 2025-11-29 06:52:40.523 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:40 np0005539509 nova_compute[225815]: 2025-11-29 06:52:40.523 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:40 np0005539509 nova_compute[225815]: 2025-11-29 06:52:40.524 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:52:40 np0005539509 nova_compute[225815]: 2025-11-29 06:52:40.524 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:52:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:40 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:52:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:41.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:42.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:43.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:43 np0005539509 podman[230986]: 2025-11-29 06:52:43.323716774 +0000 UTC m=+0.060746085 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:52:43 np0005539509 podman[230985]: 2025-11-29 06:52:43.350262798 +0000 UTC m=+0.090207697 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:52:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:44.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:45.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:46.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:47.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:47 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:48.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:49.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:50.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:51.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:53.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:54.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:55.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:56.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:57.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:58.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:52:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:59.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:00.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:01.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:02.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:04.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:05.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:06.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:07.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:08.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:09.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:10 np0005539509 podman[231075]: 2025-11-29 06:53:10.371160205 +0000 UTC m=+0.098103240 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:10.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:11.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:12.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:13.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:14 np0005539509 podman[231102]: 2025-11-29 06:53:14.325376576 +0000 UTC m=+0.062002039 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 29 01:53:14 np0005539509 podman[231101]: 2025-11-29 06:53:14.336390503 +0000 UTC m=+0.078995627 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 29 01:53:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:14.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:15.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:53:15.925 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:53:15.926 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:53:15.926 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:16.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:17.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:17 np0005539509 nova_compute[225815]: 2025-11-29 06:53:17.514 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 51.48 sec#033[00m
Nov 29 01:53:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:18.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.449 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.450 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.450 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.450 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.451 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:53:18 np0005539509 nova_compute[225815]: 2025-11-29 06:53:18.452 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:19.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:20.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:21.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:22.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:23.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.216 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.217 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.217 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.218 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.218 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:23 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:53:23 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/586851354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.654 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:23 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:53:23 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2060863307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.850 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.851 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5353MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.852 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:23 np0005539509 nova_compute[225815]: 2025-11-29 06:53:23.852 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:24.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:25.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:26.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:27.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:28.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:29.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:30.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:32.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:33.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:34.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:35.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:36.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:37.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:38.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:39.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:40.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:41.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:41 np0005539509 podman[231164]: 2025-11-29 06:53:41.329427559 +0000 UTC m=+0.115122569 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:42.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:43.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:44.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:45.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:45 np0005539509 podman[231191]: 2025-11-29 06:53:45.323061329 +0000 UTC m=+0.065899854 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:53:45 np0005539509 podman[231192]: 2025-11-29 06:53:45.347106496 +0000 UTC m=+0.083448767 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:53:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:46.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:47.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:48.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:53:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:53:48 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:53:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:49.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:50.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:51.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:52 np0005539509 nova_compute[225815]: 2025-11-29 06:53:52.104 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:53:52 np0005539509 nova_compute[225815]: 2025-11-29 06:53:52.106 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:53:52 np0005539509 nova_compute[225815]: 2025-11-29 06:53:52.129 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:52.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:53:52 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3803659993' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:53:52 np0005539509 nova_compute[225815]: 2025-11-29 06:53:52.610 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:52 np0005539509 nova_compute[225815]: 2025-11-29 06:53:52.621 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:53.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:54 np0005539509 nova_compute[225815]: 2025-11-29 06:53:54.477 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:54 np0005539509 nova_compute[225815]: 2025-11-29 06:53:54.480 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:53:54 np0005539509 nova_compute[225815]: 2025-11-29 06:53:54.480 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 30.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:54 np0005539509 nova_compute[225815]: 2025-11-29 06:53:54.481 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:54 np0005539509 nova_compute[225815]: 2025-11-29 06:53:54.481 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:53:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:54.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:55.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:56.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:57.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:58.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:53:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:59.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:59 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:53:59 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:54:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000053s ======
Nov 29 01:54:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:00.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Nov 29 01:54:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:01.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:02.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:02 np0005539509 nova_compute[225815]: 2025-11-29 06:54:02.644 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 15.12 sec#033[00m
Nov 29 01:54:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:54:02 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:54:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:54:02 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/972693801' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:54:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:03 np0005539509 nova_compute[225815]: 2025-11-29 06:54:03.050 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:54:03 np0005539509 nova_compute[225815]: 2025-11-29 06:54:03.050 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:03 np0005539509 nova_compute[225815]: 2025-11-29 06:54:03.051 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:54:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:03.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:04 np0005539509 nova_compute[225815]: 2025-11-29 06:54:04.481 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:04.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:05.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:06.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:07.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:08.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:09.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:10.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:11.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:12 np0005539509 podman[231432]: 2025-11-29 06:54:12.397366202 +0000 UTC m=+0.129576247 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 01:54:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:12.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:13.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:14.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:15.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:54:15.926 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:54:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:54:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:16 np0005539509 podman[231461]: 2025-11-29 06:54:16.325504711 +0000 UTC m=+0.063280653 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:54:16 np0005539509 podman[231460]: 2025-11-29 06:54:16.356371711 +0000 UTC m=+0.097409541 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 29 01:54:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:17.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:18.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:19.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:20.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:22.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:23.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:24.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:25.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:26.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:27.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:28.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:29.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:29 np0005539509 nova_compute[225815]: 2025-11-29 06:54:29.734 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:29 np0005539509 nova_compute[225815]: 2025-11-29 06:54:29.735 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:30.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:31.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:31 np0005539509 nova_compute[225815]: 2025-11-29 06:54:31.628 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.98 sec#033[00m
Nov 29 01:54:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:32.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:33.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:34.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:35.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:36.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:37.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:38.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:39.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:39 np0005539509 nova_compute[225815]: 2025-11-29 06:54:39.898 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:39 np0005539509 nova_compute[225815]: 2025-11-29 06:54:39.898 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:54:39 np0005539509 nova_compute[225815]: 2025-11-29 06:54:39.899 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:54:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:41.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:42.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:43.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:43 np0005539509 podman[231498]: 2025-11-29 06:54:43.341413092 +0000 UTC m=+0.083156028 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 01:54:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:44.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:45.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:46.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:47.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:47 np0005539509 podman[231525]: 2025-11-29 06:54:47.310993227 +0000 UTC m=+0.053662794 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 29 01:54:47 np0005539509 podman[231526]: 2025-11-29 06:54:47.311004488 +0000 UTC m=+0.049112073 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:54:47 np0005539509 nova_compute[225815]: 2025-11-29 06:54:47.793 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.16 sec#033[00m
Nov 29 01:54:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:48.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:49.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.360 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.361 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:49 np0005539509 nova_compute[225815]: 2025-11-29 06:54:49.362 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:50.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:51.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:52.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:53.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:54.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:55.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:56.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:57.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:58.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:54:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:59.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:00.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:01.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:01 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:55:01 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:55:01 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:55:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:02.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:03.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:04.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:05.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:07.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:08.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:09.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:10.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:11.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:13 np0005539509 nova_compute[225815]: 2025-11-29 06:55:13.291 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:13 np0005539509 nova_compute[225815]: 2025-11-29 06:55:13.291 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:55:13 np0005539509 nova_compute[225815]: 2025-11-29 06:55:13.291 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:13.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:13 np0005539509 nova_compute[225815]: 2025-11-29 06:55:13.474 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 5.67 sec#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.085 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.086 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.086 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.086 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.087 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:14 np0005539509 podman[231718]: 2025-11-29 06:55:14.39037315 +0000 UTC m=+0.124905681 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:55:14 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:55:14 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2860548019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.545 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:14.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.719 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.720 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5322MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.720 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:14 np0005539509 nova_compute[225815]: 2025-11-29 06:55:14.720 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:15.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:55:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:55:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:55:15.927 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.104 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.104 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.160 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:16 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:55:16 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/595463311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.598 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.606 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:16.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.924 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.926 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:55:16 np0005539509 nova_compute[225815]: 2025-11-29 06:55:16.927 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:17.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:18 np0005539509 podman[231769]: 2025-11-29 06:55:18.318750921 +0000 UTC m=+0.054730405 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 01:55:18 np0005539509 podman[231768]: 2025-11-29 06:55:18.319169052 +0000 UTC m=+0.061320161 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:55:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:18.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:19.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:20.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:21.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:21 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:55:21 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:55:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:22.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:23.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:24.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:25.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:26.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:27.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:28.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:29.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:30.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:31.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:34.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:35.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:36.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:38.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:40.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:42.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:44.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:45.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:45 np0005539509 podman[231857]: 2025-11-29 06:55:45.383335912 +0000 UTC m=+0.125266131 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:55:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:46.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:47.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:48.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:49 np0005539509 podman[231884]: 2025-11-29 06:55:49.330205969 +0000 UTC m=+0.064233079 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 01:55:49 np0005539509 podman[231883]: 2025-11-29 06:55:49.33505049 +0000 UTC m=+0.070807586 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:55:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:49.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:50.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:51.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:52.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:53.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:54.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:55.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:56.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:58.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:55:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:00.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:01.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:02.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:03.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:04.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:05.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:05 np0005539509 nova_compute[225815]: 2025-11-29 06:56:05.493 225819 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 2.01 sec#033[00m
Nov 29 01:56:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:06.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:08.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:10.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:12.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:13.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:14.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:15.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:56:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:56:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:56:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:16 np0005539509 podman[231923]: 2025-11-29 06:56:16.366216942 +0000 UTC m=+0.099666172 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 01:56:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:16.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:16 np0005539509 nova_compute[225815]: 2025-11-29 06:56:16.929 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:16 np0005539509 nova_compute[225815]: 2025-11-29 06:56:16.929 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:17.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:17 np0005539509 nova_compute[225815]: 2025-11-29 06:56:17.896 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:17 np0005539509 nova_compute[225815]: 2025-11-29 06:56:17.897 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:56:17 np0005539509 nova_compute[225815]: 2025-11-29 06:56:17.897 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.295 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.296 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.296 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.297 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.297 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.297 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.298 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.298 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.298 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.499 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.499 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.499 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.500 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.500 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:18.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:18 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:56:18 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3654944032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:56:18 np0005539509 nova_compute[225815]: 2025-11-29 06:56:18.964 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.126 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.127 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5333MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.128 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.128 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.381741) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379381806, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2336, "num_deletes": 251, "total_data_size": 5965364, "memory_usage": 6051152, "flush_reason": "Manual Compaction"}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379406971, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3916609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23685, "largest_seqno": 26016, "table_properties": {"data_size": 3907165, "index_size": 6002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18818, "raw_average_key_size": 20, "raw_value_size": 3888452, "raw_average_value_size": 4154, "num_data_blocks": 268, "num_entries": 936, "num_filter_entries": 936, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 25302 microseconds, and 7843 cpu microseconds.
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407035) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3916609 bytes OK
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407061) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.409415) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.409443) EVENT_LOG_v1 {"time_micros": 1764399379409434, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.409464) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5955169, prev total WAL file size 5955169, number of live WAL files 2.
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.411701) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3824KB)], [48(9012KB)]
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379411880, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 13145327, "oldest_snapshot_seqno": -1}
Nov 29 01:56:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:19.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5296 keys, 11154254 bytes, temperature: kUnknown
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379520489, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 11154254, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11116401, "index_size": 23535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13253, "raw_key_size": 134021, "raw_average_key_size": 25, "raw_value_size": 11017939, "raw_average_value_size": 2080, "num_data_blocks": 968, "num_entries": 5296, "num_filter_entries": 5296, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397161, "oldest_key_time": 0, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a3716f2b-1e8d-44d6-a6a2-2c53b019e9a5", "db_session_id": "5Q1WIIQG9BN5XI35108Y", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.520820) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 11154254 bytes
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.522406) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.9 rd, 102.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 8.8 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 5813, records dropped: 517 output_compression: NoCompression
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.522424) EVENT_LOG_v1 {"time_micros": 1764399379522414, "job": 28, "event": "compaction_finished", "compaction_time_micros": 108717, "compaction_time_cpu_micros": 31232, "output_level": 6, "num_output_files": 1, "total_output_size": 11154254, "num_input_records": 5813, "num_output_records": 5296, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379523714, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379527566, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.411496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527771) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539509 ceph-mon[80754]: rocksdb: (Original Log Time 2025/11/29-06:56:19.527780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.733 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.734 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.747 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing inventories for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.977 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating ProviderTree inventory for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:56:19 np0005539509 nova_compute[225815]: 2025-11-29 06:56:19.977 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Updating inventory in ProviderTree for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.091 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing aggregate associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.110 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Refreshing trait associations for resource provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.128 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:20 np0005539509 podman[231972]: 2025-11-29 06:56:20.330517528 +0000 UTC m=+0.069919892 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:56:20 np0005539509 podman[231974]: 2025-11-29 06:56:20.336330334 +0000 UTC m=+0.081998376 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:56:20 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:56:20 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3108214857' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.591 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.598 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:20.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.872 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.875 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:56:20 np0005539509 nova_compute[225815]: 2025-11-29 06:56:20.876 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:21.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:22.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:22 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:56:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:23.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:56:23 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:56:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:24.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:25.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:27.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:28.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:29.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:30.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:31.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:32.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:56:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:56:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:33.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:34.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:35.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:36.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:56:36 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:56:36 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:56:36 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1564014978' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:56:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:37.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:38.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:39.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:40.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:41.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:43.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:44.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:45.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:46.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:47 np0005539509 podman[232211]: 2025-11-29 06:56:47.360283506 +0000 UTC m=+0.101129332 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:56:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:47.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:48.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:49.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:50.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:51 np0005539509 podman[232239]: 2025-11-29 06:56:51.325395814 +0000 UTC m=+0.062652297 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 01:56:51 np0005539509 podman[232238]: 2025-11-29 06:56:51.337339546 +0000 UTC m=+0.069544003 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:56:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:51.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:52.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:53.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:54.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:55.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:56.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:57.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:58.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:56:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:59.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:00.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:01.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:02.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:03.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:04.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:05.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:06.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:07.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:08.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:09.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:10.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:11.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:12.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:13.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:15.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:57:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:57:15.929 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:57:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:16.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:17.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:18 np0005539509 podman[232277]: 2025-11-29 06:57:18.382892573 +0000 UTC m=+0.129847724 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 29 01:57:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:18.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:19.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:20.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:20 np0005539509 nova_compute[225815]: 2025-11-29 06:57:20.878 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:20 np0005539509 nova_compute[225815]: 2025-11-29 06:57:20.878 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:20 np0005539509 nova_compute[225815]: 2025-11-29 06:57:20.878 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:57:20 np0005539509 nova_compute[225815]: 2025-11-29 06:57:20.879 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:57:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:21.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:22 np0005539509 podman[232304]: 2025-11-29 06:57:22.309536597 +0000 UTC m=+0.049087152 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:22 np0005539509 podman[232305]: 2025-11-29 06:57:22.316419062 +0000 UTC m=+0.050107819 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:57:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:22.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:23.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:24.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:25.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.571 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.571 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.572 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.573 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.573 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:57:26 np0005539509 nova_compute[225815]: 2025-11-29 06:57:26.573 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:26.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.017 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.018 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:57:27 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2616126904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.483 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:27.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.649 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.650 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5352MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.651 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:27 np0005539509 nova_compute[225815]: 2025-11-29 06:57:27.651 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.037 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.037 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.052 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:28 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:57:28 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3901645275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.495 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.502 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.551 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.553 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:57:28 np0005539509 nova_compute[225815]: 2025-11-29 06:57:28.554 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.903s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:28.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:29.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:30.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:31.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:32 np0005539509 nova_compute[225815]: 2025-11-29 06:57:32.638 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:32 np0005539509 nova_compute[225815]: 2025-11-29 06:57:32.639 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:32 np0005539509 nova_compute[225815]: 2025-11-29 06:57:32.767 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:32.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:33.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:57:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:57:33 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:57:34 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:34 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:34 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:34.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:35.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:36 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:36 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:36 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:36.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:37.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:38 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:38 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:38 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:38.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:40 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:40 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:40 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:40.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:41.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:42 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:42 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:42 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:42.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:43.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:44 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:44 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:44 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:44.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:45.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:46 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:46 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:46 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:46.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:47.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:48 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:48 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:48 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:48.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:49 np0005539509 podman[232520]: 2025-11-29 06:57:49.345331757 +0000 UTC m=+0.088706728 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:49.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:50 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:50 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:50 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:50.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:51.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:52 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:52 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:52 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:52.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:53 np0005539509 podman[232549]: 2025-11-29 06:57:53.323414713 +0000 UTC m=+0.056874231 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:57:53 np0005539509 podman[232548]: 2025-11-29 06:57:53.350467711 +0000 UTC m=+0.089248743 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:53.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:54 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:54 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:54 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:54.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:55.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:56 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:56 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:56 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:56.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:57.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:57:57 np0005539509 ceph-mon[80754]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:57:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:58 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:58 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:58 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:58.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:57:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:59.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:00 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:00 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:00 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:00.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:02 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:02 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:02 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:02.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:04 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:04 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:04 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:04.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:05 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:05 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:05 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:05.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:06 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:06 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:06 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:06.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:07 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:07 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:07 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:07.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:07 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:08 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:08 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:08 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:08.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:09 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:09 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:09 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:09.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:10 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:10 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:10 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:10.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:11 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:11 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:11 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:11.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:12 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:12 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:12 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:12 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:12.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:13 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:13 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:13 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:13.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:14 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:14 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:14 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:14.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:15 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:15 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:15 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:15.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:58:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:58:15.930 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:15 np0005539509 ovn_metadata_agent[139241]: 2025-11-29 06:58:15.931 139246 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:16 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:16 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:16 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:16.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:17 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:17 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:17 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:17.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:17 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:18 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:18 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:18 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:18.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:19 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:19 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:19 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:19.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:19 np0005539509 nova_compute[225815]: 2025-11-29 06:58:19.968 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:20 np0005539509 podman[232635]: 2025-11-29 06:58:20.358653699 +0000 UTC m=+0.103573907 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller)
Nov 29 01:58:20 np0005539509 nova_compute[225815]: 2025-11-29 06:58:20.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:20 np0005539509 nova_compute[225815]: 2025-11-29 06:58:20.967 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:58:20 np0005539509 nova_compute[225815]: 2025-11-29 06:58:20.967 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:58:20 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:20 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:20 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:20.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:21 np0005539509 nova_compute[225815]: 2025-11-29 06:58:21.375 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:58:21 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:21 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:21 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:21.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:22 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:22 np0005539509 nova_compute[225815]: 2025-11-29 06:58:22.966 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:22 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:22 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:22 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:22.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:23 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:23 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:23 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:23.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:23 np0005539509 nova_compute[225815]: 2025-11-29 06:58:23.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:23 np0005539509 nova_compute[225815]: 2025-11-29 06:58:23.967 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:58:24 np0005539509 podman[232662]: 2025-11-29 06:58:24.334521175 +0000 UTC m=+0.077089144 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Nov 29 01:58:24 np0005539509 podman[232663]: 2025-11-29 06:58:24.349997972 +0000 UTC m=+0.087028392 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 01:58:24 np0005539509 nova_compute[225815]: 2025-11-29 06:58:24.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:24 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:24 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:24 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:24.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:25 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:25 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:25 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:25.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:25 np0005539509 nova_compute[225815]: 2025-11-29 06:58:25.648 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:25 np0005539509 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:25 np0005539509 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:25 np0005539509 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:58:25 np0005539509 nova_compute[225815]: 2025-11-29 06:58:25.649 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:58:26 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2138341124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.139 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.339 225819 WARNING nova.virt.libvirt.driver [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.340 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5342MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.341 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.342 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.449 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.450 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.467 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:26 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:58:26 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2021454943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.924 225819 DEBUG oslo_concurrency.processutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.930 225819 DEBUG nova.compute.provider_tree [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed in ProviderTree for provider: 774921e7-1fd5-4281-8c90-f7cd3ee5e01b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.953 225819 DEBUG nova.scheduler.client.report [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Inventory has not changed for provider 774921e7-1fd5-4281-8c90-f7cd3ee5e01b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.955 225819 DEBUG nova.compute.resource_tracker [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.956 225819 DEBUG oslo_concurrency.lockutils [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:26 np0005539509 nova_compute[225815]: 2025-11-29 06:58:26.957 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:26 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:26 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:26 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:26.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:27 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:27 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:27 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:27.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:27 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:27 np0005539509 nova_compute[225815]: 2025-11-29 06:58:27.975 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:27 np0005539509 nova_compute[225815]: 2025-11-29 06:58:27.975 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:28 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:28 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:28 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:28.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:29 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:29 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:29 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:29.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:29 np0005539509 nova_compute[225815]: 2025-11-29 06:58:29.966 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:30 np0005539509 nova_compute[225815]: 2025-11-29 06:58:30.962 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:30 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:30 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:30 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:30.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:31 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:31 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:31 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:31.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:32 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:32 np0005539509 nova_compute[225815]: 2025-11-29 06:58:32.967 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:32 np0005539509 nova_compute[225815]: 2025-11-29 06:58:32.968 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:58:32 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:32 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:32 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:32.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:33 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:33 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:33 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:33.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:34 np0005539509 systemd-logind[785]: New session 51 of user zuul.
Nov 29 01:58:34 np0005539509 systemd[1]: Started Session 51 of User zuul.
Nov 29 01:58:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:35.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:35 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:35 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:35 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:35.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:36 np0005539509 nova_compute[225815]: 2025-11-29 06:58:36.172 225819 DEBUG oslo_service.periodic_task [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:36 np0005539509 nova_compute[225815]: 2025-11-29 06:58:36.174 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:58:36 np0005539509 nova_compute[225815]: 2025-11-29 06:58:36.197 225819 DEBUG nova.compute.manager [None req-6f1ea916-5196-4e3f-bcfb-85f848fbc2e6 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:58:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:36 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 7210 writes, 27K keys, 7210 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 7210 writes, 1584 syncs, 4.55 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 427 writes, 658 keys, 427 commit groups, 1.0 writes per commit group, ingest: 0.21 MB, 0.00 MB/s#012Interval WAL: 427 writes, 198 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 01:58:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:37.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:37 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:37 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:37 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:37.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:37 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:38 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Nov 29 01:58:38 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1573460643' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 01:58:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:39.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:39 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:39 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:39 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:39.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:41.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:41 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:41 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 01:58:41 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:41.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 01:58:42 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:43.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:43 np0005539509 ovs-vsctl[233078]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 01:58:43 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:43 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:43 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:43.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:44 np0005539509 virtqemud[225339]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 01:58:44 np0005539509 virtqemud[225339]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 01:58:44 np0005539509 virtqemud[225339]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 01:58:44 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: cache status {prefix=cache status} (starting...)
Nov 29 01:58:44 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:45.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:45 np0005539509 lvm[233381]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:58:45 np0005539509 lvm[233381]: VG ceph_vg0 finished
Nov 29 01:58:45 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: client ls {prefix=client ls} (starting...)
Nov 29 01:58:45 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:45 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:45 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:45 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:45.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:45 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: damage ls {prefix=damage ls} (starting...)
Nov 29 01:58:45 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:45 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump loads {prefix=dump loads} (starting...)
Nov 29 01:58:45 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:45 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Nov 29 01:58:45 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1276981983' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:46 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Nov 29 01:58:46 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/135313544' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 29 01:58:46 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:47.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:47 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: ops {prefix=ops} (starting...)
Nov 29 01:58:47 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1029441042' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3853464868' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 01:58:47 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:47 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:47 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:47.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1898843982' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/961495441' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 01:58:47 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: session ls {prefix=session ls} (starting...)
Nov 29 01:58:47 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad Can't run that command on an inactive MDS!
Nov 29 01:58:47 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:47 np0005539509 ceph-mds[84384]: mds.cephfs.compute-1.vlqnad asok_command: status {prefix=status} (starting...)
Nov 29 01:58:48 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 01:58:48 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2721476524' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 01:58:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:49.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/684679155' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3910457812' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1704439386' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 01:58:49 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:49 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:49 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:49.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/532443902' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 01:58:49 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4111572753' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 01:58:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Nov 29 01:58:50 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2877614875' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 01:58:50 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Nov 29 01:58:50 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2517383288' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 01:58:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:51.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:51 np0005539509 podman[234139]: 2025-11-29 06:58:51.369827739 +0000 UTC m=+0.100615568 container health_status e028f56e030d8a63077f1642a8400e2cb304564e88c4b2340abce9633877a275 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1e] failed. State was: not registered w/ OSD
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69623808 unmapped: 1540096 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 134 heartbeat osd_stat(store_statfs(0x1bca1f000/0x0/0x1bfc00000, data 0x13d19c/0x1fe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 909376 data_alloc: 285212672 data_used: 393216
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69632000 unmapped: 1531904 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 DELETING pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 1.245882 2 0.000271
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.246134 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 134 pg[9.1e( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=131/132 n=5 ec=58/47 lis/c=131/78 les/c/f=132/79/0 sis=133) [1] r=-1 lpr=133 pi=[78,133)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 6.692822 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1e] failed. State was: not registered w/ OSD
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 134 heartbeat osd_stat(store_statfs(0x1bca20000/0x0/0x1bfc00000, data 0x13d19c/0x1fe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69632000 unmapped: 1531904 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69632000 unmapped: 1531904 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69640192 unmapped: 1523712 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69640192 unmapped: 1523712 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 134 heartbeat osd_stat(store_statfs(0x1bca20000/0x0/0x1bfc00000, data 0x13d19c/0x1fe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.325085640s of 12.187417030s, submitted: 25
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 899808 data_alloc: 285212672 data_used: 393216
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69648384 unmapped: 1515520 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69648384 unmapped: 1515520 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69648384 unmapped: 1515520 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 131.103333 122 0.000815
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary/Active 131.109797 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary 134.366278 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] exit Started 134.366639 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=98) [0] r=0 lpr=98 crt=56'1130 mlcod 0'0 active mbc={}] enter Reset
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.898561478s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 active pruub 437.762023926s@ mbc={}] start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] exit Reset 0.001853 1 0.002531
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] enter Started
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] enter Start
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] exit Start 0.000102 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 135 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135 pruub=12.896843910s) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 437.762023926s@ mbc={}] enter Started/Stray
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 903726 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 135 heartbeat osd_stat(store_statfs(0x1bca1c000/0x0/0x1bfc00000, data 0x13edf5/0x201000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69664768 unmapped: 1499136 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69672960 unmapped: 1490944 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 135 heartbeat osd_stat(store_statfs(0x1bca1c000/0x0/0x1bfc00000, data 0x13edf5/0x201000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69672960 unmapped: 1490944 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 5.847332 3 0.000221
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 5.847530 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=135) [1] r=-1 lpr=135 pi=[98,135)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Reset
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] exit Reset 0.000124 1 0.000194
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Start
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] exit Start 0.000008 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started/Primary
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started/Primary/Peering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetInfo
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000042 1 0.000051
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetLog
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetLog 0.000029 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/GetMissing
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 136 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69681152 unmapped: 1482752 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 136 heartbeat osd_stat(store_statfs(0x1bca1c000/0x0/0x1bfc00000, data 0x13edf5/0x201000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 906700 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69681152 unmapped: 1482752 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69681152 unmapped: 1482752 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69689344 unmapped: 1474560 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69689344 unmapped: 1474560 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.183051109s of 14.367411613s, submitted: 13
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering/WaitUpThru 4.780247 4 0.000078
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped+peering mbc={}] exit Started/Primary/Peering 4.780398 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=98/99 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] enter Started/Primary/Active
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 activating+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Activating
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69689344 unmapped: 1474560 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 909674 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 137 heartbeat osd_stat(store_statfs(0x1bca16000/0x0/0x1bfc00000, data 0x1425b0/0x207000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69697536 unmapped: 1466368 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=98/98 les/c/f=99/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/Activating 1.520142 5 0.001220
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000220 1 0.000260
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000973 1 0.000066
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 0'0 active+recovery_wait+remapped mbc={255={(0+1)=5}}] enter Started/Primary/Active/Recovering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69697536 unmapped: 1466368 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovering 1.535533 2 0.000170
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 137 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] enter Started/Primary/Active/Recovered
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69697536 unmapped: 1466368 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69705728 unmapped: 1458176 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 137 heartbeat osd_stat(store_statfs(0x1bca16000/0x0/0x1bfc00000, data 0x1425b0/0x207000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69713920 unmapped: 1449984 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 909834 data_alloc: 285212672 data_used: 405504
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69713920 unmapped: 1449984 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69713920 unmapped: 1449984 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69722112 unmapped: 1441792 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69722112 unmapped: 1441792 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary/Active/Recovered 6.626645 2 0.000147
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary/Active 9.684700 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started/Primary 14.465409 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] exit Started 14.465553 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=136) [1]/[0] async=[1] r=0 lpr=136 pi=[98,136)/1 crt=56'1130 mlcod 56'1130 active+remapped mbc={255={}}] enter Reset
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.835290909s) [1] async=[1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 56'1130 active pruub 461.014038086s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] exit Reset 0.000776 1 0.001470
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] enter Started
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] enter Start
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] exit Start 0.000092 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 138 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138 pruub=15.834728241s) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 461.014038086s@ mbc={}] enter Started/Stray
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 138 heartbeat osd_stat(store_statfs(0x1bca16000/0x0/0x1bfc00000, data 0x1425b0/0x207000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69738496 unmapped: 1425408 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: unregistering
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.235433578s of 11.264188766s, submitted: 8
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 912808 data_alloc: 285212672 data_used: 405504
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69746688 unmapped: 1417216 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69746688 unmapped: 1417216 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 138 heartbeat osd_stat(store_statfs(0x1bca13000/0x0/0x1bfc00000, data 0x1440d4/0x20a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69746688 unmapped: 1417216 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca13000/0x0/0x1bfc00000, data 0x1440d4/0x20a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69754880 unmapped: 1409024 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 5.380763 7 0.000523
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000109 1 0.000112
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: not registered w/ OSD
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69754880 unmapped: 1409024 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 916454 data_alloc: 285212672 data_used: 405504
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 DELETING pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 2.720789 2 0.000394
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 2.720963 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 pg_epoch: 139 pg[9.1f( v 56'1130 (0'0,56'1130] lb MIN local-lis/les=136/137 n=5 ec=58/47 lis/c=136/98 les/c/f=137/99/0 sis=138) [1] r=-1 lpr=138 pi=[98,138)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY mbc={}] exit Started 8.101946 0 0.000000
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 scrub-queue::remove_from_osd_queue removing pg[9.1f] failed. State was: not registered w/ OSD
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69763072 unmapped: 1400832 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69779456 unmapped: 1384448 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69779456 unmapped: 1384448 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69787648 unmapped: 1376256 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69795840 unmapped: 1368064 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69795840 unmapped: 1368064 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69804032 unmapped: 1359872 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69804032 unmapped: 1359872 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69804032 unmapped: 1359872 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69812224 unmapped: 1351680 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69812224 unmapped: 1351680 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69812224 unmapped: 1351680 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69820416 unmapped: 1343488 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69820416 unmapped: 1343488 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69820416 unmapped: 1343488 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69828608 unmapped: 1335296 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69828608 unmapped: 1335296 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69836800 unmapped: 1327104 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69836800 unmapped: 1327104 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69836800 unmapped: 1327104 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69844992 unmapped: 1318912 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69844992 unmapped: 1318912 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69853184 unmapped: 1310720 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69861376 unmapped: 1302528 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69861376 unmapped: 1302528 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69869568 unmapped: 1294336 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69869568 unmapped: 1294336 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69869568 unmapped: 1294336 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69877760 unmapped: 1286144 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69877760 unmapped: 1286144 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69885952 unmapped: 1277952 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69894144 unmapped: 1269760 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69902336 unmapped: 1261568 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69902336 unmapped: 1261568 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69902336 unmapped: 1261568 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69910528 unmapped: 1253376 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69918720 unmapped: 1245184 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69918720 unmapped: 1245184 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69918720 unmapped: 1245184 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69926912 unmapped: 1236992 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69926912 unmapped: 1236992 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69935104 unmapped: 1228800 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69935104 unmapped: 1228800 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69943296 unmapped: 1220608 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69951488 unmapped: 1212416 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69951488 unmapped: 1212416 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69951488 unmapped: 1212416 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69959680 unmapped: 1204224 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69959680 unmapped: 1204224 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69967872 unmapped: 1196032 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69967872 unmapped: 1196032 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69967872 unmapped: 1196032 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69976064 unmapped: 1187840 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69976064 unmapped: 1187840 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69984256 unmapped: 1179648 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69984256 unmapped: 1179648 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69984256 unmapped: 1179648 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69992448 unmapped: 1171456 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69992448 unmapped: 1171456 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 69992448 unmapped: 1171456 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70000640 unmapped: 1163264 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70000640 unmapped: 1163264 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70008832 unmapped: 1155072 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70008832 unmapped: 1155072 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70008832 unmapped: 1155072 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70017024 unmapped: 1146880 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70017024 unmapped: 1146880 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70017024 unmapped: 1146880 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70025216 unmapped: 1138688 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70025216 unmapped: 1138688 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70033408 unmapped: 1130496 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70033408 unmapped: 1130496 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70033408 unmapped: 1130496 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70041600 unmapped: 1122304 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70041600 unmapped: 1122304 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70041600 unmapped: 1122304 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70049792 unmapped: 1114112 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70049792 unmapped: 1114112 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70057984 unmapped: 1105920 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70057984 unmapped: 1105920 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70066176 unmapped: 1097728 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70066176 unmapped: 1097728 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70066176 unmapped: 1097728 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70074368 unmapped: 1089536 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70074368 unmapped: 1089536 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70074368 unmapped: 1089536 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70082560 unmapped: 1081344 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70082560 unmapped: 1081344 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70098944 unmapped: 1064960 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70098944 unmapped: 1064960 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70107136 unmapped: 1056768 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70115328 unmapped: 1048576 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70115328 unmapped: 1048576 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70115328 unmapped: 1048576 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70123520 unmapped: 1040384 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70131712 unmapped: 1032192 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70139904 unmapped: 1024000 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70139904 unmapped: 1024000 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70148096 unmapped: 1015808 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70156288 unmapped: 1007616 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70156288 unmapped: 1007616 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70156288 unmapped: 1007616 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70164480 unmapped: 999424 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70172672 unmapped: 991232 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70172672 unmapped: 991232 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70180864 unmapped: 983040 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70180864 unmapped: 983040 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70180864 unmapped: 983040 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70189056 unmapped: 974848 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70189056 unmapped: 974848 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70197248 unmapped: 966656 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70197248 unmapped: 966656 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70197248 unmapped: 966656 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70205440 unmapped: 958464 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70205440 unmapped: 958464 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70205440 unmapped: 958464 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70213632 unmapped: 950272 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70213632 unmapped: 950272 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70213632 unmapped: 950272 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70221824 unmapped: 942080 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70221824 unmapped: 942080 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70230016 unmapped: 933888 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70230016 unmapped: 933888 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70230016 unmapped: 933888 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70238208 unmapped: 925696 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70246400 unmapped: 917504 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70262784 unmapped: 901120 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70262784 unmapped: 901120 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70270976 unmapped: 892928 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70270976 unmapped: 892928 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70270976 unmapped: 892928 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70279168 unmapped: 884736 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70279168 unmapped: 884736 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70279168 unmapped: 884736 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70287360 unmapped: 876544 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70287360 unmapped: 876544 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 19.19 MB, 0.03 MB/s
Interval WAL: 5840 writes, 940 syncs, 6.21 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.56 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.67844e-05%) FilterBlock(3,0.33 KB,2.00272e-05%) IndexBlock(3,0.34 KB,2.09808e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70361088 unmapped: 802816 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70361088 unmapped: 802816 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70361088 unmapped: 802816 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70369280 unmapped: 794624 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70369280 unmapped: 794624 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70369280 unmapped: 794624 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70377472 unmapped: 786432 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70377472 unmapped: 786432 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70377472 unmapped: 786432 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70385664 unmapped: 778240 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70385664 unmapped: 778240 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70393856 unmapped: 770048 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70393856 unmapped: 770048 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70393856 unmapped: 770048 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70402048 unmapped: 761856 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70402048 unmapped: 761856 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70410240 unmapped: 753664 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70410240 unmapped: 753664 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70410240 unmapped: 753664 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70418432 unmapped: 745472 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70418432 unmapped: 745472 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70426624 unmapped: 737280 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70426624 unmapped: 737280 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70426624 unmapped: 737280 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70434816 unmapped: 729088 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70434816 unmapped: 729088 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70434816 unmapped: 729088 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70443008 unmapped: 720896 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70443008 unmapped: 720896 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70443008 unmapped: 720896 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70451200 unmapped: 712704 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70451200 unmapped: 712704 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70459392 unmapped: 704512 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70459392 unmapped: 704512 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70459392 unmapped: 704512 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70475776 unmapped: 688128 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70475776 unmapped: 688128 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70475776 unmapped: 688128 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70483968 unmapped: 679936 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70483968 unmapped: 679936 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70483968 unmapped: 679936 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70492160 unmapped: 671744 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70492160 unmapped: 671744 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70500352 unmapped: 663552 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70500352 unmapped: 663552 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70500352 unmapped: 663552 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70508544 unmapped: 655360 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70508544 unmapped: 655360 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70516736 unmapped: 647168 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70516736 unmapped: 647168 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70516736 unmapped: 647168 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70524928 unmapped: 638976 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70524928 unmapped: 638976 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70524928 unmapped: 638976 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70533120 unmapped: 630784 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70533120 unmapped: 630784 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70541312 unmapped: 622592 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70541312 unmapped: 622592 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70549504 unmapped: 614400 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70549504 unmapped: 614400 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70549504 unmapped: 614400 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70557696 unmapped: 606208 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70557696 unmapped: 606208 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70565888 unmapped: 598016 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70565888 unmapped: 598016 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70565888 unmapped: 598016 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.04
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 3872179014 kv_alloc: 1677721600 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1610612736 meta_used: 904783 data_alloc: 285212672 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 5502923980 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 3872179014
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals no callback set
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_accepted_admin_roles = ResellerAdmin, swiftoperator: Configuration option 'rgw_keystone_accepted_admin_roles' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_accepted_roles = member, Member, admin: Configuration option 'rgw_keystone_accepted_roles' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_domain = default: Configuration option 'rgw_keystone_admin_domain' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_password = 12345678: Configuration option 'rgw_keystone_admin_password' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_project = service: Configuration option 'rgw_keystone_admin_project' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_admin_user = swift: Configuration option 'rgw_keystone_admin_user' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_implicit_tenants = true: Configuration option 'rgw_keystone_implicit_tenants' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: set_mon_vals failed to set rgw_keystone_url = https://keystone-internal.openstack.svc:5000: Configuration option 'rgw_keystone_url' may not be modified at runtime
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _update_cache_settings updated pcm target: 4294967296 pcm min: 134217728 pcm max: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 589824 heap: 71163904 old mem: 3872179014 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 581632 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 581632 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 565248 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 565248 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 557056 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 557056 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 557056 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 548864 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 548864 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 548864 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 532480 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 532480 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 532480 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 524288 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 524288 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 516096 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 516096 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 507904 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 499712 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 499712 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 491520 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 491520 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 491520 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 483328 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 483328 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 483328 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 475136 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 475136 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 466944 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 466944 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 466944 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 458752 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 458752 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 458752 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 450560 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 450560 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 442368 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 442368 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 442368 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 434176 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 434176 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 434176 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 425984 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 425984 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 425984 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 417792 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 417792 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 417792 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 409600 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 409600 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 401408 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 393216 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 385024 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 295.191497803s of 297.923675537s, submitted: 5
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 376832 heap: 71163904 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,0,1,1,2])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 1384448 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 1376256 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 1351680 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904927 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 1318912 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 1302528 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 163840 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,1,0,0,0,0,2])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 65536 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,0,0,0,0,2])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 16384 heap: 72212480 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904855 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 999424 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 966656 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 950272 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 942080 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 933888 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 925696 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 925696 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 925696 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 917504 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 909312 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73416704 unmapped: 892928 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 884736 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 876544 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 860160 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 843776 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 835584 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 819200 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 802816 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 794624 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 786432 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 786432 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 778240 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c8bc9000 session 0x5566c98185a0
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 745472 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 729088 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73588736 unmapped: 720896 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 712704 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 712704 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 712704 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 696320 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 688128 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73646080 unmapped: 663552 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 655360 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 638976 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 606208 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 581632 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73695232 unmapped: 614400 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 598016 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 598016 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73711616 unmapped: 598016 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566ca6b0800 session 0x5566c89854a0
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 589824 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 573440 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 557056 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73760768 unmapped: 548864 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 532480 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 532480 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 532480 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73785344 unmapped: 524288 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 516096 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73793536 unmapped: 516096 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73801728 unmapped: 507904 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73809920 unmapped: 499712 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 483328 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73842688 unmapped: 466944 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73850880 unmapped: 458752 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 450560 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73875456 unmapped: 434176 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 425984 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73883648 unmapped: 425984 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73891840 unmapped: 417792 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73908224 unmapped: 401408 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6284 writes, 25K keys, 6284 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6284 writes, 1144 syncs, 5.49 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 444 writes, 711 keys, 444 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 444 writes, 204 syncs, 2.18 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5566c6fdc2d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73924608 unmapped: 385024 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73940992 unmapped: 368640 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73949184 unmapped: 360448 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 344064 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 335872 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73990144 unmapped: 319488 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 73998336 unmapped: 311296 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 74006528 unmapped: 303104 heap: 74309632 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 594.295410156s of 600.553833008s, submitted: 240
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904942 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 221184 heap: 75358208 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75374592 unmapped: 1032192 heap: 76406784 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75489280 unmapped: 1966080 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75497472 unmapped: 1957888 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75505664 unmapped: 1949696 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 1933312 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 1933312 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 1933312 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75530240 unmapped: 1925120 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75538432 unmapped: 1916928 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75546624 unmapped: 1908736 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75554816 unmapped: 1900544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75563008 unmapped: 1892352 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75571200 unmapped: 1884160 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1875968 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1875968 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75579392 unmapped: 1875968 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 1859584 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75612160 unmapped: 1843200 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75628544 unmapped: 1826816 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75644928 unmapped: 1810432 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75644928 unmapped: 1810432 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75653120 unmapped: 1802240 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75661312 unmapped: 1794048 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75677696 unmapped: 1777664 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 1761280 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75702272 unmapped: 1753088 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 1744896 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 1728512 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75743232 unmapped: 1712128 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 1695744 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75767808 unmapped: 1687552 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75767808 unmapped: 1687552 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75767808 unmapped: 1687552 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 1671168 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75800576 unmapped: 1654784 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75808768 unmapped: 1646592 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 1638400 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 1622016 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75841536 unmapped: 1613824 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75857920 unmapped: 1597440 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75874304 unmapped: 1581056 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75890688 unmapped: 1564672 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75907072 unmapped: 1548288 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75915264 unmapped: 1540096 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75915264 unmapped: 1540096 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75915264 unmapped: 1540096 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75931648 unmapped: 1523712 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75948032 unmapped: 1507328 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.1 total, 600.0 interval
Cumulative writes: 6783 writes, 26K keys, 6783 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 6783 writes, 1386 syncs, 4.89 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 499 writes, 770 keys, 499 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
Interval WAL: 499 writes, 242 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 75964416 unmapped: 1490944 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: mgrc ms_handle_reset ms_handle_reset con 0x5566c6fd9c00
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1221624088
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1221624088,v1:192.168.122.100:6801/1221624088]
Nov 29 01:58:51 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:51 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:51 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:51.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: mgrc handle_mgr_configure stats_period=5
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76136448 unmapped: 1318912 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c8bb9000 session 0x5566c7db7c20
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76152832 unmapped: 1302528 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c929a000 session 0x5566caee6780
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c929a400 session 0x5566c7db7680
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 1294336 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 1294336 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 1294336 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 1286144 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 1269760 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 1261568 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 1245184 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 1228800 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 1212416 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 1204224 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 1204224 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 1204224 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 1187840 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 1163264 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 ms_handle_reset con 0x5566c8bc9000 session 0x5566cb593e00
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 1155072 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 599.411193848s of 600.292236328s, submitted: 257
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,0,0,0,0,0,0,4,2])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 1179648 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 1114112 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [0,0,0,1])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76488704 unmapped: 966656 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 876544 heap: 77455360 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77660160 unmapped: 1892352 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77717504 unmapped: 1835008 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 1826816 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 1818624 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 1810432 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 1810432 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77750272 unmapped: 1802240 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77750272 unmapped: 1802240 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77750272 unmapped: 1802240 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77758464 unmapped: 1794048 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77766656 unmapped: 1785856 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 1777664 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77783040 unmapped: 1769472 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 1761280 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 1753088 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 1744896 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 1736704 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 7210 writes, 27K keys, 7210 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 7210 writes, 1584 syncs, 4.55 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 427 writes, 658 keys, 427 commit groups, 1.0 writes per commit group, ingest: 0.21 MB, 0.00 MB/s#012Interval WAL: 427 writes, 198 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 1728512 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 1540096 heap: 79552512 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'config diff' '{prefix=config diff}'
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'config show' '{prefix=config show}'
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'counter dump' '{prefix=counter dump}'
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'counter schema' '{prefix=counter schema}'
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 78512128 unmapped: 2088960 heap: 80601088 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 1982464 heap: 80601088 old mem: 2845415833 new mem: 2845415833
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: osd.0 139 heartbeat osd_stat(store_statfs(0x1bca12000/0x0/0x1bfc00000, data 0x145b75/0x20c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2fdf9c7), peers [1,2] op hist [])
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: bluestore.MempoolThread(0x5566c70bbb60) _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904783 data_alloc: 218103808 data_used: 401408
Nov 29 01:58:51 np0005539509 ceph-osd[78089]: do_command 'log dump' '{prefix=log dump}'
Nov 29 01:58:52 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:53.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:53 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:53 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:53 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:53.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 01:58:53 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1233692885' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 01:58:53 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Nov 29 01:58:53 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3946325939' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 01:58:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 01:58:54 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3661981742' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 01:58:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 01:58:54 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3635354810' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 01:58:54 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 01:58:54 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3429457872' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 01:58:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:55.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:55 np0005539509 podman[234545]: 2025-11-29 06:58:55.448766129 +0000 UTC m=+0.075202774 container health_status b24fc6b0bbead758d2eed1911b0e12399d60965de5df1dc2cd7f02e0cc0358f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:58:55 np0005539509 podman[234544]: 2025-11-29 06:58:55.459183479 +0000 UTC m=+0.082366326 container health_status 28ca3742468a7e0b0437d156af59c0b04c47b23d2e8b58e7e858692e37f15834 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 01:58:55 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:55 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:55 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:55.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:55 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 01:58:55 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1183724430' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 01:58:56 np0005539509 systemd[1]: Starting Hostname Service...
Nov 29 01:58:56 np0005539509 systemd[1]: Started Hostname Service.
Nov 29 01:58:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:57.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:57 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:57 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:57 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:57.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:57 np0005539509 podman[234983]: 2025-11-29 06:58:57.858548207 +0000 UTC m=+0.096372834 container exec 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 01:58:57 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:57 np0005539509 podman[234983]: 2025-11-29 06:58:57.957652022 +0000 UTC m=+0.195476629 container exec_died 4384fb97959c63d88ac36dfbdc7349f1aeb1412e448d2b6167ef165a76e0bdb6 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-1, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 01:58:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:59.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Nov 29 01:58:59 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3562221273' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 01:58:59 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 01:58:59 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2432053198' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 01:58:59 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:58:59 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:59 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:59.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:59:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 01:59:00 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1177386484' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 01:59:00 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Nov 29 01:59:00 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/255758300' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 29 01:59:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:59:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:59:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:01.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:59:01 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:59:01 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:59:01 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:01.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:59:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Nov 29 01:59:02 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/269676052' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Nov 29 01:59:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Nov 29 01:59:02 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1938043130' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 29 01:59:02 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:59:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:59:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:59:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:03.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:59:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Nov 29 01:59:03 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4256520886' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Nov 29 01:59:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Nov 29 01:59:03 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/585662946' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Nov 29 01:59:03 np0005539509 ceph-mon[80754]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Nov 29 01:59:03 np0005539509 ceph-mon[80754]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1903066086' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Nov 29 01:59:03 np0005539509 radosgw[83442]: ====== starting new request req=0x7f2d48ef86f0 =====
Nov 29 01:59:03 np0005539509 radosgw[83442]: ====== req done req=0x7f2d48ef86f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:59:03 np0005539509 radosgw[83442]: beast: 0x7f2d48ef86f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:03.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
